The Trajectory of Mental Health Services

Tegethoff, M., Stalujanis, E., Belardi, A., & Meinlschmidt, G. (2014). School Mental Health Services: Signpost for Out-of-School Service Utilization in Adolescents with Mental Disorders? A Nationally Representative United States Cohort. PLoS ONE, 9(6), e99675. doi:10.1371/journal.pone.0099675


This article, “School Mental Health Services: Signpost for Out-of-School Service Utilization in Adolescents with Mental Disorders? A Nationally Representative United States Cohort”, looked at how school mental health service providers, such as school psychologists, serve as a guide to out-of-school medical or mental health providers (Tegethoff, Stalujanis, Belardi, & Meinlschmidt, 2014). They used data collected from the National Comorbidity Survey Replication Adolescent Supplement (NCS-A), which is an incredibly in-depth study of children and adolescents with mental illnesses completed between 2001 and 2004. The NCS-A study included data from interviews with the child/adolescent, rating forms completed by the child/adolescent and a parent/guardian, and detailed demographic information. This data is available for other researchers to use in their own research. For this study, the authors ran a wide range of statistical analyses on the data collected in the NCS-A study in order to reach their final conclusions. Overall, the researchers found that school-based mental health services do typically guide families to out-of-school medical service providers, such as pediatricians or emergency rooms. Less often, school-based mental health services guide families to mental health specialists, such as psychiatrists or outpatient mental health clinics.

What was interesting to me, and what I’m still having a hard time understanding, was the “hazard ratios” computed by the researchers. I was able to learn that a hazard ratio is a statistic that compares how quickly an event occurs in one group versus another over time. The analysis did not necessarily look at whether or not (for example) school psychologists encouraged parents to go to the pediatrician or psychiatrist, but at how quickly families went on to seek those services. (At least, I think that’s what it was saying!) Also, despite my cursory knowledge of statistics and research on hazard ratios, I was unable to understand what a hazard ratio (HR) of 1.17 or 3.15 really meant. I did figure out that HR = 3.15 implies roughly three times the rate of HR = 1.17, meaning families sought services that much sooner, but I can’t turn that into raw numbers. (I mean… three times the rate of what? What does 1 equal? A week? A month? Some theoretical amount of time that is never really determined? Argh!) The article doesn’t really explain it, though it may be because their usual audience would already know.
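To try to turn those ratios into something concrete for myself, here is a toy sketch of my own (the baseline rate is completely made up, and the article does not present its model this way) showing how a hazard ratio scales typical wait times if we assume a simple exponential time-to-event model:

```python
import math

def median_wait(baseline_rate, hazard_ratio):
    """Median time-to-event under an exponential model:
    effective rate = baseline_rate * hazard_ratio, median = ln(2) / rate."""
    return math.log(2) / (baseline_rate * hazard_ratio)

# Hypothetical baseline: families seek out-of-school services at a
# rate of 0.02 per week (i.e., about 2% of remaining families each week).
baseline = 0.02
print(round(median_wait(baseline, 1.0), 1))   # 34.7 weeks at the baseline rate
print(round(median_wait(baseline, 1.17), 1))  # 29.6 weeks when HR = 1.17
print(round(median_wait(baseline, 3.15), 1))  # 11.0 weeks when HR = 3.15
```

Under this toy model, a bigger hazard ratio means services are sought sooner, not later, and the “raw number” still depends on a baseline rate the article never pins down, which I think explains my confusion.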

The authors organized the article in an easy-to-read format, and I really appreciated that the tables were embedded into the article instead of attached as appendices. I think it just makes it a lot easier to read that way. I enjoyed how they used end notes and not parentheticals, too – I know it’s just a personal preference (and I have made my peace with APA style, since that’s the preferred format for educational research), but I do think end notes allow for more fluid reading. There was not a specific literature review included, though the authors did reference many studies previously conducted to support their reasons and findings along the way.

Some of the data reported in the tables was interesting, but I’m not sure how necessary or useful it was. For example, I found it really interesting to see that nearly 50% of the children and adolescents in the total sample reported having 3 or more siblings, and that over 35% of the total sample were first-borns. Though I am not a fan of birth-order theories, I had a mental jaunt about this. Does a bigger family simply increase your chances of at least one child having a mental illness? Are first-borns more susceptible to mental illness? Or is it just that parents need more support recognizing mental illness in their first-born, and are more likely to seek outside services for later children without the school initially intervening? Interesting questions – right?? Well, they are to me. But even though this information is given in the article, it was not referenced anywhere in the text. I’m a fan of visuals, but only when the pertinent information is provided and explained within the article. Otherwise, it just appears to be fluff and can overwhelm the reader.

According to the article, “this is the first comprehensive study of the role of the school mental health sector as a guide to mental health care in out-of-school sectors” (Tegethoff, Stalujanis, Belardi, & Meinlschmidt, 2014, p. 6). It did not explore whether school mental health providers caused students to seek outside mental health support, but did look for a temporal relationship (i.e., that the school-based referral came before seeking outside help). The next step would be to look at causality. The authors note that this will be really difficult to do, though, because of the size of the sample needed to make it representative of the entire population.

In some ways, I was pretty bummed to realize this article didn’t explore causality. I mean – was this really necessary to prove? Did anyone NOT think the school-based referrals came first?? But then I remembered one of the basic tenets of research the way it exists in the world: if it hasn’t been proven, then it can’t be used to support your theory. I also thought about the handful of my students whose mental illnesses I don’t recognize until their psychiatrist sends a script requesting testing or a 504 Plan. So there are times that the school is not the first one to find a problem.

Additionally, even though this information cannot be used to show causality, it did have interesting findings. It implies that school psychologists and other school-based mental health professionals are doing well at referring to pediatricians and emergency rooms, but less well at connecting directly with community-based mental health service providers such as psychiatrists and therapists. I would say I find that to be true in my district. We (school psychologists) shy away from making recommendations about medical diagnoses because we are not doctors. I think it’s because of this that we are also wary of sending parents to mental health providers and instead suggest they check our results with their primary care physician first. And when we have a child in extreme crisis, we tend to recommend the emergency room to families and not the psych ward. I’m not sure how much is due to our desire to be “PC,” how much is due to a hesitancy to suggest medical diagnoses, and how much is pragmatically due to insurance requirements (i.e., many insurance plans require a formal referral from a primary care physician before paying for specialist services like a psychiatrist).

As I’ve mentioned before, I would love to see school-based mental health service providers working together with medical professionals. I would love to have wrap-around services provided within the school, including access to fully licensed psychiatrists and therapists. I think school psychologists have more training and expertise than we sometimes give ourselves credit for, and I want us to be part of a system that supports us using that expertise. This article helps clarify where we land in the overall mental health trajectory for students, and I think it can speak to what our next steps should be (better awareness and better connections with outside providers).

Finally, on a completely separate note, I have recently read many articles about “outsiders” doing research on a particular culture or society, and how that framework is inappropriate. I have struggled to really understand the problem with it, because the “outsiders” have primarily been people who I identify with. But this study was completed by Swiss and German researchers on U.S. children and adolescents. Why? Why wouldn’t they study their own countries’ mental health trajectories (especially since they probably have a very different medical system than the U.S., so results would not generalize to their countries)? Why wouldn’t U.S. researchers have been the ones to spearhead this study, or at least be involved?? Overall, I don’t see anything in the results that seems like they would be tainted by the nationality of the researchers, but I still questioned why. Ah… perhaps this is a small example of what those indigenous cultures have experienced!

Race and class impact instructional decisions

Mertzman, T. (2008). Individualising scaffolding: Teachers’ literacy interruptions of ethnic minority students and students from low socioeconomic backgrounds. Journal of Research in Reading, 31(2), 183–202. doi:10.1111/j.1467-9817.2007.00356.x


The article, Individualising scaffolding: Teachers’ literacy interruptions of ethnic minority students and students from low socioeconomic backgrounds, examined how a group of teachers used interruptions as a scaffolding technique during literacy instruction. Teacher-to-student interruptions are a common practice, especially when teaching students how to read, but little research has analyzed the types of interruptions teachers implement, which specific students teachers interrupt, or what effect interruptions have on literacy development. Before the study was conducted, participating teachers were interviewed to determine their beliefs and approaches around literacy instruction. The study showed, however, that teacher beliefs often contradicted or conflicted in some way with actual instructional practice. A key finding is that with ethnic minority students, teacher interruptions were more frequent and were more often related to a phonics or accuracy issue instead of an issue related to reading comprehension.


A particularly helpful heading was “Research on teachers’ literacy interruption” (Mertzman, 2008, p. 187) because it allowed me to quickly go to that section to see what is out there in terms of this topic. There is not a whole lot of research out there, but something the author did write in this section that I was not too surprised about was that students in lower ability reading groups were interrupted more than students in higher ability groups (Mertzman, 2008). This implies, however, that if a student is in a lower-ability group, they will end up reading less on a daily basis than students who are in a higher ability reading group. This made me quickly see the relevance of this issue not only to instructional scaffolding, which is the breaking down of concepts in order for students to learn, but also to the hot topic of ability tracking. This caused me to ask, if students read less in lower ability classrooms, does this mean that we should discontinue ability tracking? It’s a complex question, but one that I began to think about right away as a result of this section.

Another helpful heading was entitled, “Overall patterns of teacher interruptions: more focus on word recognition than on text meaning” (Mertzman, 2008, p. 190). This told me right away that teachers cared more about phonics than comprehension. In other words, in this study, it was found that teachers felt it was more crucial that kids could read the words accurately than understand what they actually mean. This made me think of a concept I am learning in my human development class: kids at this age are cognitively able to realize that words represent concepts, so it is crucial that we focus on both the pronunciation and meaning when kids come across words.

Contribution to the Field

One major contribution to the field of early childhood literacy and instruction is the identification of the types of interruptions that are implemented in classrooms. This allows early childhood educators to discuss these types of interactions with colleagues in order to be cognizant of them and improve upon them. The types of interruptions that teachers implement are: “student or teacher model, scold, praise, repeat answer, explain the right answer, focus on meaning, focus on word recognition and sounding out Convergent questions” (Mertzman, 2008, p. 191). Knowing what these interruptions look like in practice will allow us to study them more in the future, especially as there are positive interruptions that promote student academic achievement and those that hinder it.

Data Collection Methods

This study was conducted through close examination of four different classrooms within the same school. It was made clear to teachers that the point of this study would be to analyze interactions between teachers and students, but interruptions were never mentioned in order to avoid the problem of participants being self-conscious about these types of interactions. Once teachers were selected, each teacher was observed for two entire days of instruction in order to provide context for student behaviors throughout the day. Then, the period of class that was exclusively devoted to literacy instruction was filmed. Immediately following the literacy period, the researcher interviewed both the teachers and the students who were interrupted. The filmed segments were then played back to the interviewees in order to get a sense of the thoughts and feelings the participant had behind each interruption. Then, transcripts were consulted in order to begin the data analysis process and the identification of the types of interruptions occurring.


The key question that the author was trying to answer was whether literacy interruption patterns were different with students from different races/economic classes (Mertzman, 2008). Unfortunately, the findings were that yes, they are. In the interviews conducted before the study, teachers never once mentioned socioeconomic status or race as a means to individualize instruction. However, as ethnic minority students were more likely to be interrupted than their white, higher income peers, it seems that teachers do in fact consider race and class as a factor when making instructional decisions. Additionally, the fact that the interruption types were more likely to be a word recognition/phonics issue does not support a balanced approach to literacy (Mertzman, 2008).

New ideas this study suggests for my area of interest

The author made it clear that interruptions can be a powerful force to effectively scaffold a child’s instruction. As this study identified interruptions that would foster a balanced approach to literacy, I began to think of cues that could be taught to teachers during professional development. I thought how when I go and observe teachers, I can specifically focus on the interruption types, the frequency of them and to whom they are being given in order to come up with appropriate suggestions for instructional improvement. I also thought about how we can connect positive interruptions to the idea of helping students manage their uncertainty within the context of learning how to read.

Further study

It is important to note, as disturbing as the results of this study are, that this was a very small-scale study. Only four teachers were studied and the school was in a rural area of the Southeastern United States. Therefore, to get a better sense as to whether race and socioeconomic status impact teachers’ literacy scaffolding, larger studies in more diverse settings should be executed.

A top-down approach to retention

In many ways and for a variety of reasons, I’ll take a top-down approach to research when looking at ways to improve retention rates amongst Hispanic students at private, Catholic and prep schools. One of these ways will be reading articles centered on retention rates of Hispanic students at post-secondary schools. First off, there is more research out there on supporting Hispanic students at colleges than there is at prep schools. And secondly, if I can discern ways to help Hispanic students succeed in college, certainly many if not all of the methods utilized could be used to support Hispanic students at college prep schools as well. I think retention rates at these two types of schools are closely related, especially since one type of school builds to the other.

I found one such article by G.M. Stern (2008) on Mercy College, a small college in upstate New York, and its attempt to recruit and retain Hispanic students. I’d like to think that an article about Hispanic college students in New York is relevant to my findings in Phoenix because I’d love to send my high school students back East for college. I think there is value to leaving one’s home state and seeing who you become away from home. One of my best Hispanic students is leaving us this fall to study engineering at Swarthmore on a full scholarship. He’s someone I’ll be interviewing for my research as well.

From Stern (2008), I found it interesting to learn that, “Of all undergraduates enrolled at Mercy, Hispanics comprise 29 percent of the student body, more than double the percentage of Latinos in the U.S. population. Last year, U.S. News & World Report named Mercy one of the seven most racially and ethnically diverse colleges in the North” (p. 2). This college in New York seems like one to keep an eye on because it is obviously doing something right when it comes to recruiting and retaining Hispanic students. One thing Mercy seems to do well is discerning which students will succeed at their college before those students even enroll. Stern (2008) has written, “The college is seeking students that fit Mercy regardless of their ethnic background” (p. 1). This suggests that Mercy has a strong sense of self, knowing which students will succeed at their college because they’ve reflected upon this thoroughly prior to their enrollment process. This sounds like an important first step as schools try to better retain Hispanic students on their campuses: know thyself, to borrow a phrase from the ancient Greeks.


The author’s article was well organized and easy to follow. The headings were helpful, especially ones like “Retention Strategies.” I also found it helpful that some of the headings guided me to individual Mercy students and how they found success at the college. For instance, there was Karen Quijano. Stern (2008) wrote of Quijano, “She didn’t apply to large campuses with 30,000 or so students because she considered the numbers too intimidating” (p. 3). I think that Quijano’s example goes back to Mercy College’s front-end recruitment of students they know will have success at their college. There doesn’t seem to be a point to recruiting in bulk. Rather, be it secondary or post-secondary, it seems best to really do research on the students early on in the enrollment process.

Literature Review

Stern’s article was not a bastion of well-researched theory on Hispanic education as it pertains to Mercy College, so in this case, I found his literature review somewhat lacking. Stern’s research centered on Mercy College itself, interviewing faculty, students, and administrators on the things it does well. It wasn’t just that, though. Stern also analyzed data relating to Mercy College in the context of the state of New York and colleges in general. Still, I didn’t find the depth of theory regarding why Hispanic students might do well at this college. I’m not sure if this is a negative thing or not. I plan on discussing ideas like indigenous methodologies in my research paper, but I also would like to have some “boots on the ground” data so to speak regarding tangible information from schools that are successful in recruiting and keeping Hispanic students.

Data Collection

As intimated in the previous section, Stern’s data, for the most part, comes from the analysis of college retention numbers at the school and data related to other colleges in New York and the East Coast as a whole. Stern (2008) wrote, “Despite all of its efforts and successes, Mercy College can’t retain all of its Hispanic students. Though 29 percent of students enrolled are Latino, only 18 percent of its graduates are Latino. Why don’t the other 11 percent graduate?” (p. 5). After presenting this information, Stern went on to analyze the results and answered his own question with anecdotal evidence and interview data.


Stern did a nice job of allowing others to do his analysis for him. Just as I previously mentioned, the author asked questions like the one above and used more knowledgeable sources to provide answers. Stern (2008) wrote here, transcribing the thoughts of Carolyn Tragni, Mercy College’s assistant vice president for academic support, “’Some Latino students come in with barriers. They’re working to support families, sending money back to families, coming out of high schools that may not have prepared them for college – and for some, English isn’t their native language. We struggle like any college to retain a higher number of students’” (p. 11). The data provided by Stern supports that Mercy College is doing a good job of retaining its Hispanic students. What Tragni relates here is that each year attrition will happen with Hispanic students. I’d like to not accept this answer, and I’d like to find methodologies to offset some of the things that Tragni mentioned in this article as hindrances to Hispanic students finding success in college.

Theoretical Framework/Lens

Stern wrote his article as a researcher and inquisitor. I feel that he wrote as a reporter as well. Reporters tend to be devoid of opinions and simply want to relay facts to readers. I found that here to some extent. His writing style does seem to create distance between him and his readers. Still, the subject matter and research show that Stern cares about the topic – or else he would not have taken the time to write about this college and its success with regards to the issue of Hispanic education. Reporters find that they must cover, for instance, car crashes or fires for whatever news source they work for. This topic does not cry out for public consumption, and so Stern’s interest in bringing it to the attention of others proves that he does care for the population it relates to. I will say, though, that he seems very conscious of trying to relate facts and best practices without himself being a part of the data. This, I believe, only strengthens his findings.

Findings & Conclusions

Stern, through using this small college in New York, has found two things that interest me in my pursuit of finding better ways to recruit and retain Hispanic students at private, prep, and Catholic secondary schools. First off, know your own school and student body well and find students who fit profiles of success at your school. This seems very important. Schools must consciously reflect upon what types of students can and will do well by reflecting upon the types of successful graduates they have had previously. Once this reflection occurs, the rest follows much more smoothly. Secondly, support your students. Stern (2008) quoted the assistant vice president once again, “What’s the key to attracting Latino students to a private college? The ‘holistic’ approach works best, Tragni said, combining one-on-one assistance from an advisor, identifying problems early, providing academic support and offering career development” (p. 12). I think one-on-one advisors, which are of course not unique to Mercy among colleges and universities, are vital here for Hispanic students; even more, though, it centers on how well these advisors are utilized. Students at post-secondary or secondary private schools need means of support throughout the entire school process. From Stern’s piece I’ve gleaned that the front-end recruitment process seems the most important, with multiple means of support as something that is imperative as well.


Stern, G. M. (2008). Mercy College: A retention model for Hispanic students. The Hispanic Outlook in Higher Education, 18, 55-57.


Don’t Believe the Hype: Indigenous Student Learning Styles May Not Conform to the Literature

Roppolo, K. & Crow, C. (2007). Native American education vs. Indian learning: Still battling Pratt after all these years. Studies in American Indian Literatures, 19(1), 3-31.

Written in Blood

I surrender to Roget’s pocket Thesaurus

I confess my crime of breaking into this container of words,

and slaughtering this poem with meta innuendo./

But I needed something. I wanted to gather the dust

of 84 warriors and 62 women & children. I robbed,

from this vault of words, language of the enemy, in hopes/

I could capture these people, allow their prayers to

Reach Wovoka in the final hour before I end this poem.

I wanted to know that I am not merely grieving from the guilt /

of that European blood that separates me from two worlds.

I need to know that I can be allowed my grief.

Sadly, I have failed. This 1961 Cardinal edition thesaurus/

I depended upon has betrayed me. Betrayed my Indian kin.

With this language there are times I feel I’m betraying myself.

In my search for synonyms for murder, I find Cain,/

Assassin, barbarian, gunman, brute

hoodlum, killer, executioner, butcher

savage, Apache, redskin.

(Midge, p. 212)


I am using Tiffany Midge’s poem (Harjo & Bird, 1997) to speak a truth about the relationship between Indigenous peoples and dominant-culture education. The legacy of 300 plus years of “Indian education” has left Indigenous communities fractured and traumatized. Our people are in various stages of quasi-assimilation (because we know it’s impossible for full assimilation as we are racialized as brown, or red as is often referenced), without identity, ashamed, and thankfully, recovering. In much of the literature education is lauded as the key to our future and our very survival (Campbell, 2007; Guillory & Wolverton, 2008; Harjo & Bird, 1997; Roppolo & Crow, 2007). Unfortunately, college degree attainment has not become widespread among Indigenous peoples in the U.S. despite the various research examining this social problem (Guillory & Wolverton, 2008). This week, I turned to a small pilot study that took place at a tribal college on teaching Native American literature to Indigenous students, for a slightly different approach to Indigenous learning at the college level.

In Native American education vs. Indian learning: Still battling Pratt after all these years, Roppolo and Crow (2007) rely on the research about Indigenous students’ learning styles to construct an intensive week-long Native American literature course for five (mostly Cheyenne and Arapaho) students. The authors used two quantitative assessments: one to assess whether students are auditory or visual learners, and the other to assess whether they are individualistic or collectivist. Upon finding that the students in their course did not test as the assessments predicted (visual and collectivist), the authors used their knowledge of Cheyenne and Arapaho culture to create an additional assessment to measure assimilation, in hopes of explaining the contradictory results. Following are some questions on the assessment. Answering true to the first four and false on the last question reflects “traditional” responses.

  • If I acquire a beautiful object that I value greatly and someone I respect will appreciate it, I would give it to them as a measure of my respect and gratitude without regret.
  • Life experience is more important than book learning, though both can be beneficial.
  • Sometimes hard things have to be said, but it is important to say them in a good way so that people won’t be offended no matter how bad their own behavior has been.
  • My family’s needs are more important than my own.
  • If I went to somebody’s house after eating a large meal and they offered me food, I would politely say that I was full, but thank you, or find some other way of turning down the offer.
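As a side note for my own understanding, the keyed scoring described above (true on the first four items, false on the last) could be sketched like this; this is my own illustration, not the authors’ actual instrument or scoring procedure:

```python
# Hypothetical scoring of the five sample items: items 1-4 are keyed True,
# item 5 is keyed False; each match with the key counts as one
# "traditional" response.
TRADITIONAL_KEY = [True, True, True, True, False]

def traditional_score(responses):
    """Count how many answers match the 'traditional' key."""
    return sum(r == k for r, k in zip(responses, TRADITIONAL_KEY))

print(traditional_score([True, True, True, True, False]))   # 5: fully "traditional"
print(traditional_score([False, True, False, True, True]))  # 2: more assimilated
```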

The authors also used qualitative data collected from student assignments and their own observations as participants in the research. They found that the literature on Indigenous education does not always apply and that researchers must be willing to change their methods mid-practice in order to impact learning. As the subtitle suggests, society is still using predetermined notions (current literature) of what Indigenous students are like to create an educational environment for them that may not work.

Theoretical Framework and Literature Review

The authors used a constructivist approach as a learning theory, philosophy and application in the classroom. Constructivism is the idea that students have their own cultural wealth they bring into the classroom, and rather than ignore that capital, the instructors build upon it so that students can construct new knowledge upon the foundation of previously constructed knowledge, which is beneficial for learning. I found the literature review sparse; however, the authors used an article that reviewed the recent literature (as of 2003) on improving academic performance among Native American students. They also used literature that examined historical educational practices toward Indigenous peoples, the impact of cultural literacies, cultural variables, and cross-cultural assessment in learning environments, and a few specific research studies about Indigenous peoples and education. I think they could have used more research to support their own. I am convinced that the articles situate the authors’ contribution as innovative; however, I am not convinced that the 12 articles specifically about Indigenous students set a solid foundation for the authors’ contribution. It seems to me there could be a lot more support for their argument.

Data and Analysis

Three quantitative methods were used: Joyzelle Godfrey’s assessment of personal tendency toward individualism or collectivism, the Brainworks Personal Evaluation to test for auditory/visual preference, and the authors’ own 24-item survey to test for assimilation. The 24-item survey was created midweek to find out why students had tested as auditory learners rather than visual and individualistic rather than collectivist. Interestingly, the students’ scores contradicted the literature that theorized Indigenous students were holistic and visual learners. The authors also used qualitative data, the students’ journaling and classroom assignments as well as the authors’ own observations as participants in the learning environment. The qualitative data offered more insight into how the students connected with the coursework and gave a richer description of how the students’ lived experiences impacted the work they completed in the class. The two quantitative methods that tested preferences for auditory/visual and individualistic/collectivist revealed the students to be unlike what the data had shown for Indigenous students, which led to the creation of the new assessment.


The authors theorized that students tested strongly as both visual and auditory because the culture requires it. Engaging in traditional ceremony requires one to complete complex tasks, often after very long instructions. One needs auditory focus to be successful. Also, attending ceremonies and watching what others are doing prepares one for the day it is their turn. The authors suggest their contradictory findings on students’ learning styles make the case that we should not apply predetermined theory and practice to Indigenous students but must be flexible enough to change theory mid-practice to ensure real learning happens and that new pedagogies and theories can arise.

The authors’ findings are in line with the complexity of human beings and the idea that people adapt. Being willing to adjust your research methods mid-stream takes reflection, critical thinking and courage. To me, this is what it means to be innovative and to practice excellence. The authors did not simply accept the contradictions and report those findings; instead, they set out to figure out why the students in their research contradicted the literature. In the end, the students’ final projects were a mix of visual representation, spoken word, and written word.


Campbell, A. (2007). Retaining American Indian/Alaska Native students in higher education: A case study of one partnership between the Tohono O’odham Nation and Pima Community College, Tucson, AZ. Journal of American Indian Education, 46(2), 19-42.

Guillory, R. M., & Wolverton, M. (2008). It's about family: Native American student persistence in higher education. The Journal of Higher Education, 79(1), 58-87.

Midge, T. (1997). Written in Blood. In Harjo, J., & Bird, G. (Eds.), Reinventing the Enemy's Language: Contemporary Native Women's Writings of North America (p. 212). New York: Norton.

Roppolo, K. & Crow, C. (2007). Native American education vs. Indian learning: Still battling Pratt after all these years. Studies in American Indian Literatures, 19(1), 3-31.

Using difference-education to make a difference

Stephens, N.M., Hamedani, M.G., & Destin, M. (2014). Closing the Social-Class Achievement Gap: A Difference-Education Intervention Improves First-Generation Students’ Academic Performance and All Students’ College Transition. Psychological Science, 25(4), 943-953.

How can difference-education make a difference in the success outcomes of first-generation university freshmen?  A recently published study, authored by Nicole M. Stephens, MarYam G. Hamedani and Mesmin Destin (2014), sheds light on the matter.

Stephens, Hamedani, and Destin conducted a study to determine if an educational intervention that highlights difference and demonstrates why difference matters would reduce the achievement gap between first-generation students (those whose parents do not have a four-year college education) and continuing-generation students (those who have at least one parent with a four-year college degree).  Using a convenience sample to recruit first-year students and financial incentives to entice them to participate, the researchers organized two moderated panels of college seniors who shared their personal stories of how they had succeeded at that university.  The same group of seniors spoke at each panel and shared the same stories; the only major difference between the panels was that one panel included difference-education, in the form of the panelists identifying their social-class backgrounds and linking their stories to those backgrounds, whereas the other panel did not include such mention of experience-based difference.  The freshmen participating in the study (both first-generation and continuing-generation students) were randomly assigned to observe one panel or the other.

In addition to observing the panel, participating freshmen were asked immediately afterward to complete a brief survey about what they learned and how they would use that learning to advise future incoming students, and they also filmed a brief video testimonial that they were told would be used to educate the following year’s cohort of freshmen (the researchers added this wrinkle to produce the saying-is-believing effect articulated by Yeager and Walton (2011)).  At the end of the year, participants also completed a survey designed to gauge their understanding of difference, how much they utilized available student resources at the college, and the success of their college transition as determined by a range of psychosocial measures such as levels of stress and student engagement.

The results are encouraging.  After eliminating outliers and controlling for other factors such as SAT scores and high school GPA, the researchers found that the achievement gap (measured by year-end college GPA) between first-generation and continuing-generation students who observed the difference-education panel was virtually eliminated!  In contrast, a significant gap emerged between first-generation and continuing-generation students who observed the standard panel that did not contain difference-education.  The researchers also found that, although there was not a significant difference in year-end GPA between the groups of continuing-generation students who participated in the study, the group of first-generation students who observed the difference-education panel had a much higher mean GPA than the group of first-generation students who observed the standard panel.  Similar patterns emerged in relation to utilization of college resources.

This seems like a sound study.  The researchers' survey design and statistical analysis controlled for confounding variables, and the results were statistically significant.  Moreover, the researchers used multiple methods of obtaining data.

I am intrigued by this research because it relates directly to my area of inquiry.  It also confirms other research articles I've read, my own personal observations of students, and conversations I've had with colleagues, all supporting the notion that, although interventions such as academic skill development programs and financial literacy education can clearly be beneficial for first-generation students, educators must also be attuned to psychological factors, such as self-efficacy and feelings of belonging and hope, that can impact student success outcomes.  Students can be exceptionally bright, but if they feel like they don't belong in college, if they don't recognize that their struggles and challenges might be related to differences in their backgrounds rather than to who they are as individuals, or if they don't seek help, their chances for success are diminished.

This study also has major implications for issues of access and equity in education, a major item on the national agenda.  As the authors of the study wrote, the achievement gap between first-generation and continuing-generation students is well documented, and first-generation students make up a large percentage of the student population.  Therefore, administrators seeking to improve their institutional graduation rates and promote student success should be aware of this study and consider how they might use the findings in their own context.

The researchers identified several areas for future study.  For example, they suggested studies on how similar interventions might affect other areas in which there are educational disparities.  This study has definitely given me ideas for my own research.  I would like to try a similar intervention at ASU.  The key will be finding a way to do it at scale.

Additional Reference

Yeager, D.S., & Walton, G.M. (2011). Social-psychological interventions in education: They’re not magic. Review of Educational Research, 81, 267-301.


Putting the cart ahead of the horse.

Zembylas, M. (2008). Adult learners’ emotions in online learning. Distance Education, 29(1), 71-87.

As I continue down the path of gaining better insight into the research out there around professional development, I'm also pursuing the research around adult online learning. This particular branch of my area of inquiry is quite interesting because it takes me all over the world. Adult online education has been a huge thing in European countries for decades now, and much research has been done abroad. I find these particular articles interesting because, while the focus and topic may be similar, it is often the lens that is different and, at times, quite revealing.

The article I most recently read centers on a qualitative study out of the Open University of Cyprus in Nicosia, Cyprus. This particular study followed the emotional ups and downs of twenty-two graduate students enrolled in a year-long distance education course on multicultural education and social justice pedagogies. The majority of the participants were women, with an average age of 36. The researcher, Michalinos Zembylas, was the professor for the course, and he gathered data on the emotions of the online students through a monthly journal, two in-person interviews at the start and end of the course, 867 email messages between him and the students, his reflective journal, field notes from the face-to-face meetings, phone conversations, student work documents, and his own planning.

His findings include: new online students experience both positive and negative emotions in relation to their new roles and the new setting in which they're learning; those emotions change over the course of the year; and men and women experience the outside pressures of being a student differently.

In general, I found there to be many strengths to this journal article. The organization and overall coherence of the article are good; it was very clear, understandable, and particularly easy to read. I do think that his participation in the study might have led him to write from a point of view that was then easier to read. I believe that elements of this work are original to an extent and do offer something new to the field. The fact that the students self-reported their own emotions, and the causes of those emotions, throughout the year-long course does seem to make this study unique. I also think the socio-historical aspect of the setting in which the study took place provides a uniqueness to the text that I'll speak more about later. I found the literature review to be thorough; it naturally built up to the need and space for this particular study. The theoretical framework that was chosen, critical and poststructuralist thinking, seems to make sense for the setting. This framework places and identifies emotion in a space between the public and the individual; hence, the students' self-reporting would occupy that same location. One last strength of the article was the many direct quotes captured from student documents, which really highlight and humanize the findings. One striking finding in Zembylas' work was the difference in emotional reporting, and the reasons for it, between men and women. I will come back to this particular point later, as I personally connected with it.

As interesting and easy as it was to read Zembylas' work, I did find a variety of elements disconcerting in the research, and I want to offer them up as areas for improvement. Since this research utilizes only qualitative data, it would seem wise to have incorporated some elements of quantitative research, and even to have utilized a second reviewer to ensure more reliability across the coding and results. I found the fact that he was the students' professor to be highly problematic for this whole design model. In the article, Zembylas mentions discussing the research on the first night and inquiring how students felt about the study and his position of power. Having heard no negative responses, he offered them 10% of their grade if they would participate. I know there are rules about when you can and cannot "pay" someone to participate, but this method seems to run against the established code of conduct. If the journal assignments were somehow aligned to the topic of the course, then maybe. A similar concern was that the author never listed or illuminated the specific interview questions he utilized in this study. In his findings, the author mentions that the students' emotions changed over the year. Frankly, if I thought about it for a little while, I could have written these findings off the top of my head. The "findings" didn't appear to be revealing or revolutionary; in fact, they were a letdown after the development of the literature review and the scope of the study. Upon even deeper reflection, I don't know how he made the leap from his findings to his very strong conclusions and implications. One conclusion he establishes is that the emotions of the students changed over the course of the year. Again, that is so vague and obvious that I could have written that sentence years ago without any data.
What would have been revealing is an even deeper dive into the data and a more linear projection that truly demonstrated how the students' emotions changed over time. There seems to be a chasm of disconnect between the findings and the conclusions. I was also surprised to see citations from other articles in the Conclusions section; it would seem these concepts and citations should be up in the literature review, helping to build the case for why this study is important.

This article struck a personal chord because of the slightly tangential point it made: working women, as professionals and students, expressed more negative emotions around the many roles and responsibilities they have to keep up alongside their studies. Over the past year and a half, I have had to face this same battle, and I think it speaks to something Zembylas mentioned: these women have gone back to college at the graduate level and are being asked to take on the promise of equal opportunity and gender equality while still maintaining traditional family values. I face this balance daily, and I have wanted to engage further in research on the tension between expanding gender equality, or greater access for women to ladder-climbing experiences, and the lack of equally fluid sharing of responsibilities in the family unit.

Lastly, I do think this study has opened up a new consideration for my field of influence: the incorporation of, or explicit focus on, user emotion in professional development that occurs online. I can imagine applying interventions that provide students with innovative strategies for engaging with the importance of user emotion in a learning experience, as well as its correspondence to particular events in the online platform.

In general, this article seemed to make quite a few leaps from its findings to its conclusions. So much so, that I almost assumed the author had written other articles in which this data was further unpacked, with graphics to support the statements made and help the reader track the trends that would logically lead to an appropriate conclusion.

I will use the positives and improvements to make sure I don’t put the cart of my conclusions before the horse of my findings.

“Damned to be concrete”: Considering productive uncertainty in data visualization

Marx, V. (2013). “Data visualization: Ambiguity as a fellow traveler.” Nature Methods, 10(7), 613-615.

In their musings on the importance of uncertainty with regards to social networks and educational attainment, Jordan and McDaniel (In Press) bring to the forefront an interesting concept of "productive uncertainty" (p. 5).  This idea allows that while uncertainty is not always pleasant—and while learners will often seek to minimize it—the experience is not without value.  Marx (2013), while discussing the complexities and shortcomings common among data visualizations, expands upon this concept: uncertainty, particularly within a statistical realm, can illuminate new characteristics of the data or new methodologies that address shortcomings in collection or analysis.  However, data visualizations themselves can obscure or outright hide this level of detail.  So how do we visualize data in a way that is both simple and transparent?

"[With visuals], we are damned to be concrete" (Marx, 2013, p. 613).

Marx (2013), using examples from genomic and biomedical research, raises an interesting dilemma: in discussing scientific results, researchers often feel compelled to gloss over, if not exactly obscure, uncertainty in their data.  These uncertainties can arise from inconsistent collection, imperfect aggregation, or even unexpected results.  However, these "unloved fellow travelers of science" (p. 613) cannot exist visually in the kind of "grey area" analysis that Marx contends they often occupy in text.  When faced with creating an honest visualization, then, researchers must decide to what extent they will account for study uncertainty.  Marx, in explaining the potential impacts of this decision, advocates that researchers strongly consider two points: first, that uncertainty may have implications for the data itself; and second, that a transparent consideration of uncertainty strongly impacts "what comes next."

Thus, Marx (2013) is explicitly pushing productivity over negativity when reflecting upon uncertainty in data or the wider study; however, she also acknowledges that even in the specific context of biomedical research, the pull to minimize uncertainty when broadly discussing results exists.

Down the rabbit hole: Analysis can create uncertainty too

One should also consider the process—largely mathematical, in this context—of moving from a raw dataset to a clean visualization.  Common steps for creating data visualizations, particularly in genomics and the biomedical sciences, often include aggregating data from different sources (and thus methods of collection) or summarizing large and complex markers into something more easily digestible.  By attempting to standardize disparate collection methods into something more uniform, or by summarizing disparate study groups or grouped variables, an important level of detail is lost.  These processes themselves can obscure data, which in turn obscures uncertainty for the end audience, whose exposure to this study may wholly lie in the visualization.  Going somewhat down the rabbit hole, this in itself can therefore create new uncertainty.

Certainly, simplicity is important in a data visualization; however, as Marx argues, researchers also have an obligation to consider that by glossing over details of uncertainty, or by creating new sources of uncertainty through their analyses, the wider community may understand their work less, or may make assumptions of their findings that are unfounded.

In particular, missing data presents a complex dilemma.  Marx (2013) gives the example of a genomic sequencing experiment, seeking to map a stretch of genetic material that contains 60 million bases:

"The scientists obtain results and a statistical distribution of sequencing coverage across the genome.  Some stretches might be sequenced 100-fold, whereas other stretches have lower sequencing depths or no coverage at all…But after an alignment, scientists might find that they have aligned only 50 million of the sought-after 60 million bases…This ambiguity in large data sets due to missing data—in this example, 10 million bases—is a big challenge" (p. 614).

As opposed to data that is statistically uncertain, or uncertain by virtue of its collection methods, missing data is a true negative whose effect is difficult to truthfully express and explain.

So how do we show uncertainty visually?

Marx suggests several methods for including uncertainty visually when discussing data.  Broadly, she suggests including some representation of uncertainty within a visualization; this can be layered on top of the data visualized—for example, using color coding or varying levels of transparency to indicate more and less certain data.  A visualization can also account for uncertainty separate from the data, by using an additional symbol to denote certainty or the reverse, for example.  She also discusses contrasting analyses of similar (or the same) data that have reached differing conclusions; taking into account their methods of analysis, this inclusion of multiple viewpoints can also round out a discussion of uncertainty.
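As a rough illustration of the transparency idea, a visualization layer might compute an opacity value from each point's relative uncertainty. This is a minimal sketch with invented numbers and an invented mapping rule, not anything from Marx's article:

```python
# Minimal sketch (invented values and rule): map each data point's
# relative uncertainty to an opacity level, so less certain values
# render fainter in a plot.

def certainty_alpha(value, stderr, min_alpha=0.2):
    """Return an opacity in [min_alpha, 1.0]; fully opaque when stderr is 0."""
    if value == 0:
        return min_alpha
    rel_uncertainty = min(abs(stderr / value), 1.0)
    return max(min_alpha, 1.0 - rel_uncertainty)

# Hypothetical (value, standard error) pairs
measurements = [(10.0, 0.5), (8.0, 4.0), (6.0, 0.0)]
alphas = [round(certainty_alpha(v, se), 2) for v, se in measurements]
print(alphas)  # the noisy middle point renders at half opacity
```

A plotting library would then pass each alpha alongside its bar or point; the point is that the certainty encoding stays layered on top of, and separable from, the underlying values.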

In addition to understanding how to represent uncertainty visually, however, one should also consider how and when (during a study or its analysis) one should tabulate uncertainty.  One platform looking to incorporate uncertainty holistically into data visualization is Refinery.  In particular, Marx notes that this system seeks to find "ways to highlight for scientists what might be missing in their data and their analysis steps" (p. 614), addressing uncertainty situated in both the data and the analysis.  As shown below, this system considers uncertainty at all steps throughout the data analysis, rather than only at the end, giving a more rounded picture of how uncertainty has influenced the study at all levels.

"The team developing the visualization platform Refinery (top row) is testing how to let users track uncertainty levels (orange) that arise in each data analysis step" (Marx, 2013, p. 615).

In the graphic, the blue boxes represent data at different stages of analysis.  Orange, in the top row, represents the types of uncertainty that may arise during each analytical step, culminating in the orange error bars in the bar graph at the far right, which are much more comprehensive in their calculation.  The light blue bar in the bottom row shows, theoretically, the disparity when error is only taken into account at the end of an analysis.  While the magnitude of uncertainty may not differ as significantly as shown in the graphic, researchers in the top row are better able to account for what causes or has caused error; they are better able to situate their uncertainty.
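The step-by-step bookkeeping that the top row illustrates can be sketched abstractly. The pipeline, the numbers, and the quadrature rule for combining independent per-step errors below are my own simplifying assumptions, not Refinery's actual mechanics:

```python
import math

# Simplified sketch (assumed mechanics, invented numbers): carry an
# uncertainty estimate through each analysis step, instead of estimating
# error only once at the end, combining independent step errors in quadrature.

def propagate(value, uncertainty, steps):
    """Apply (transform, step_error) pairs, recording value and uncertainty per step."""
    history = [(value, uncertainty)]
    for transform, step_error in steps:
        value = transform(value)
        uncertainty = math.sqrt(uncertainty ** 2 + step_error ** 2)
        history.append((value, uncertainty))
    return history

# Hypothetical two-step pipeline: an "alignment" that drops 10% of the
# data, then a "summarization" that shifts the estimate.
trace = propagate(50.0, 1.0, [(lambda v: v * 0.9, 2.0), (lambda v: v + 5, 0.5)])
print([(round(v, 1), round(u, 2)) for v, u in trace])
```

The value of the trace is exactly what the graphic argues for: at the end you can see not just the final error bar but which step contributed most of it.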

A picture may be worth a thousand words, but does it have to tell one story?

Analyzing data is often a narrative process; however, as Marx (2013) suggests, there can be consequences to how one tells the story.  Washing over uncertainty, in both preparing and discussing results, can be misleading, limiting both a researcher's true understanding of their own data and the collaborations or theories that use the data as a foundation for further study.  Marx, however, is not disparaging as dishonest those researchers who fail to consider uncertainty; she is promoting the idea that considering uncertainty positive—or productive—can lead research in novel directions.


Jordan, M.E. & McDaniel, R.R. (In Press). “Managing uncertainty during collaborative problem solving in elementary school teams: The role of peer influences in robotics engineering activity.” The Journal of the Learning Sciences, 1-49.

Is Culture the New Dumping Ground?

Ladson-Billings, G. (2006). It's not the culture of poverty, it's the poverty of culture: The problem with teacher education. Anthropology & Education Quarterly, 37(2), 104-109.

Howard, T. C. (2010). Why race and culture matter in schools: Closing the achievement gap in America's classrooms. New York, NY: Teachers College Press.

Losen, D. J. (2011). Discipline policies, successful schools, and racial justice. Boulder, CO: National Education Policy Center.

The journal article It's not the Culture of Poverty, It's the Poverty of Culture: The Problem with Teacher Education by Ladson-Billings (2006) asserts that anthropology should be part of teacher preparation programs. The author describes how pre-service teachers take courses in philosophy, sociology, history, and psychology, but anthropology is typically absent from teacher programs. Ladson-Billings (2006) argues, "The problem of culture in teaching is not merely one of exclusion. It is also one of overdetermination" (p. 104). She describes overdetermination as the way "culture is randomly and regularly used to explain everything." In her research, the author describes how she collected data on pre-service and new-to-the-profession teachers' understanding of culture.

Data collection was not a strength of this article. Although I was entirely engaged by the data the author collected through interviews, electronic portfolios, and student journals, the explanation of the data analysis was not detailed enough to replicate. Ladson-Billings (2006) reflects on "critical incidents" captured through the data collected. She conducted the interviews at the end of the pre-service teachers' field experience, asking them to tell her about a child who was difficult to handle in class. I was saddened to learn that most pre-service teachers described the difficult student as one who was not like them in race, gender, or ethnicity. In their interview responses, the majority of teachers identified African American boys as the most difficult students to handle. This reminded me of Losen's (2011) work, Discipline Policies, Successful Schools, and Racial Justice, in which he refers to a speech by Secretary of Education Arne Duncan suggesting that "students with disabilities and Black students, especially males were suspended far more often than their white counterparts" (p. 3).

One incident that the author reflects on is a conversation she had with one of the pre-service teachers, who said, "The black kids just talk so loud and don't listen." Ladson-Billings asked the pre-service teacher why they thought that, and the teacher responded, "I don't know; I guess it's cultural" (Ladson-Billings, 2006, p. 106). As I read this, I flashed back to conversations I have overheard at schools when teachers are talking about the reasons students are not successful, parents are not involved, or students are not making good choices, and the answer I often hear is culture. Ladson-Billings asserts that "culture has become the answer to every problem" (2006, p. 106).

Through the data collection, the researcher invited pre-service teachers to consider their own culture. The majority of her pre-service teachers were white, middle-class, monolingual Mid-Westerners. I was astonished by their responses detailed in the article: "They describe themselves as having 'no culture' or being 'just regular' or just normal" (Ladson-Billings, 2006, p. 107). I believe that in order for teachers to understand and value their students' culture, they have to know, understand, and value their own culture. I connected their responses to what Howard (2010) refers to as the demographic divide, where the majority of the teachers she interviewed are white and the majority of the student population is African American. Howard (2010) explains how "cross-racial teaching and learning arrangements have the potential for varying degrees of misunderstandings between students and teachers, especially where teachers lack the training and competence necessary to effectively teach students from diverse groups" (p. 43).

Organization is a strength in the article It’s not the Culture of Poverty, It’s the Poverty of Culture: The Problem with Teacher Education. Ladson-Billings is masterful in weaving in and out of data analysis and conclusions. Although this article was not organized with typical headings found in empirical research, it was written in a way that was easy to navigate. The author was also succinct in the development of her argument. I was drawn in by the examples, stories and clarity of her writing.

Another strength of this article is the conclusions the author draws. Ladson-Billings draws three major conclusions throughout the article. The first is that pre-service teachers need to interact with students outside of the school setting; she reminds us of the importance of celebrating students' successes outside of academics, arguing that this will support pre-service teachers in becoming "careful observers of cultures," both their students' and their own (Ladson-Billings, 2006, p. 109). The second is that pre-service teachers need to experience schooling in other parts of the world. The last is that pre-service teachers need to identify their own culture and own it.

I believe researching pre-service teachers and new to the profession teachers’ understanding of culture is a meaningful contribution to the field of education. I think it is important for teachers to understand their own culture and the students that they interact with. I also believe the author raises an issue that I have seen and heard on many school campuses and that is blaming student failure on culture.  Culture should not be “the answer” or the dumping ground for failures that happen within the educational system.

“I think I can, I think I can, I….”

Sriram, R. (2014). Rethinking Intelligence: The role of mindset in promoting success for academically high-risk students. Journal of College Student Retention: Research, Theory, and Practice, 15(4), 515–536.

Consider your responses to the following three statements: "a) you have a certain amount of intelligence and you really can't do much to change it; b) your intelligence is something about you that you can't change very much; and c) you can learn new things, but you can't really change your basic intelligence" (Dweck et al., as cited in Sriram, 2014).  Not exactly "I think I can…"; rather, "I think I can't."  These statements are used to measure something called mindset, a construct studied (primarily in children) to determine one's view of intelligence and, consequently, one's ability to learn and be successful.  This article is one of the few that looks at mindset in college students.

Being convinced myself that what a person believes to be true is likely to become their reality, I was immediately drawn to Carol Dweck’s mindset research about how one thinks about their own intelligence (Yeager and Dweck, 2012).  People who think intelligence is malleable are said to have a growth mindset; they also tend to be more goal-oriented.  People who think intelligence is unchangeable are said to have a fixed mindset and are less motivated to put forth effort.  Sriram does a thorough and convincing job of providing support for the development of growth mindset in high-risk college students.   A growth mindset is important because, “besides prior academic achievement, the motivation and energy students apply to their education is the best predictor of their learning and development” (Sriram, 2014).

This article is readable, contributes to the literature, and suggests topics for further research and policy.  The author is methodical and thorough in his approach, providing context and rationale for his study.  He works at Baylor University, and his study was of Baylor students who were considered academically high-risk for that institution.  Students were randomly assigned to either a control or an experimental group.  Both groups participated in four short (15-minute) web-based sessions over the course of four weeks as part of a remedial course they were taking.  The control group was instructed in study skills; the experimental group was exposed to the idea that intelligence can be developed.  Based on a comparison of pre- and post-tests (consisting of the three statements at the start of this blog), the experimental group's view of their intelligence shifted toward a growth mindset.  They also reported exerting more academic effort.  Given the positive results, happily, there were more students in the experimental group (n = 60) than in the control group (n = 45).

Sriram is not stingy about providing methodological and contextual details.  He acknowledges that colleges and universities are spending a lot of money on programs to help students succeed.  With the influx of students to higher education, there are greater numbers of students who show up under-prepared.  Baylor, like most colleges and universities, wants to retain its students and help them graduate.  Again like so many schools, Baylor has created remedial support.  However, a significant amount of research indicates that remedial coursework does not increase students' rate of graduation, so Sriram set out to find something that might supplement remedial programming.  The few studies on college students' mindset suggested that a growth mindset might increase motivation and effort.

The reader is meticulously guided through the author's data collection.  He clearly defines terms and uses the same terms often enough to help the reader make connections.  The analysis is detailed: he addresses internal and external validity and treatment fidelity, examines the pre/post-test differences, and names the kinds of analysis he is doing, ANOVA and MANOVA, with multivariate and univariate effects, degrees of freedom, and conditional effects.  I was, however, left with curiosity about the details of each of the 15-minute online intervention sessions.  The author detailed seven distinct parts of the experimental sessions, including a quote, a video, reflection questions, and information about the brain, and said that the control groups experienced similar activities.  An appendix with the details of each session, including the quotes and URLs for the videos, would allow others to duplicate the intervention.  Perhaps he is saving that level of detail, though, until his results also show higher achievement.  Despite the fact that students in the experimental group displayed more of a growth mindset on the post-test and greater study efforts than the control group, they did not show higher academic achievement based on end-of-semester GPA.
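For readers who, like me, had to look up what an ANOVA actually computes: at its core, it compares between-group variance to within-group variance. A bare-bones sketch with made-up post-test scores (nothing here comes from Sriram's data):

```python
# Bare-bones one-way ANOVA F statistic, computed by hand.
# The group scores below are invented for illustration only.

def one_way_anova_f(groups):
    """Return the F statistic (between-group / within-group mean squares)."""
    all_vals = [x for g in groups for x in g]
    n, k = len(all_vals), len(groups)
    grand_mean = sum(all_vals) / n
    means = [sum(g) / len(g) for g in groups]
    # Variation of group means around the grand mean
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    # Variation of individual scores around their own group mean
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

control = [3.0, 3.2, 2.8, 3.1]       # hypothetical post-test mindset scores
experimental = [4.1, 3.9, 4.3, 4.0]
print(round(one_way_anova_f([control, experimental]), 1))  # large F = groups differ
```

With only two groups, this F is just the square of a t statistic; a real analysis would also report a p value, which a library routine such as scipy.stats.f_oneway provides.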

The author offers possible explanations for that and other surprising results.  He suggests that the brevity of his intervention may be to blame for the lack of increased academic achievement and recommends that future studies lengthen it.  A surprise was that, on closer examination of the conditional effects, students of color and men did not show increased effort; that result was limited to women and students of European descent.

This article reads like a story – a story with new and more complex details that reveal themselves at each reading.  After reading (and re-reading) this article, I am considering adding an intervention to enhance students’ growth mindset to workshops I am constructing on motivation with a colleague for our institution’s new early alert program this fall.  The program will focus on students in developmental classes.  Attendance at the four workshops is optional and we were not anticipating that students would necessarily attend all four.

Considering how to make this research humanizing, it seems to me it must include dialogue.  In this study the intervention is solely web-based; there is no interaction with other students or instructors.  Getting feedback from the students on their attempts to enhance a growth mindset – and, before that, being transparent with them about the questions the researcher is attempting to answer – would humanize the research.

If students – especially those who come to college under-prepared – could think about intelligence as dynamic rather than static, that could improve self-confidence, resulting in increased motivation to engage in more strategic academic behaviors.  Then the mantra, “I think I can…”, will take hold, propelling the student to greater effort and, hopefully, college success.

Yeager, D. S., & Dweck, C. S. (2012). Mindsets That Promote Resilience: When Students Believe That Personal Characteristics Can Be Developed. Educational Psychologist, 47(4), 302–314. doi:10.1080/00461520.2012.722805

Sriram, R. (2014). Rethinking Intelligence: The Role of Mindset in Promoting Success for Academically High-Risk Students. Journal of College Student Retention: Research, Theory, and Practice, 15(4), 515–536.

College Persistence…Mentoring Matters

Bordes-Edgar, V., Arredondo, P., Kurpius, S.R., & Rund, J. (2011). A Longitudinal Analysis of Latina/o Students’ Academic Persistence. Journal of Hispanic Higher Education, 10(4), 358-368.


It has previously been shown that there is a positive correlation between student success and participation in mentoring programs in higher education (Salas, Aragon, Alandejani, & Timpson, 2014; Bordes & Arredondo, 2005). In the article “A Longitudinal Analysis of Latina/o Students’ Academic Persistence,” authors Bordes-Edgar, Arredondo, Kurpius, and Rund (2011) use data from a longitudinal study to determine what factors might impact student persistence in higher education, and then re-examine the actual impact of those same factors 4.5 years later. The factors examined included decision making, self-efficacy, mentoring, value of education, family valuing of education, perceived social support (family and friends), and academic factors (including entrance exam scores, high school GPA, and college GPA).


Participants in the survey were Latina/o students from a southwestern university. In the original study, there were 112 first-semester freshman students. Of those 112 students, 76 (20 men and 56 women) agreed to be part of a follow-up study. Of the 76 students in the follow-up, 21 (6 men and 15 women) were still enrolled, 25 (4 men and 21 women) had graduated, 25 (8 men and 17 women) had dropped out, and 5 had been withdrawn for academic reasons. It should be noted that those who were withdrawn for academic reasons were not included in the final sample.


After receiving consent from each of the participants, researchers accessed student data on all participants involved in the study. Based on the admission data gathered, participants were grouped into one of four separate categories: “graduated, enrolled, dropped-out, and academically withdrawn” (p. 361).

The original survey included demographic information and self-report measures. In both the original survey and the follow-up survey, several instruments (scales) were used to measure the relationship of various factors to student success, including student decision making, self-efficacy, mentoring, value of education, family valuing of education, perceived social support (family and friends), and academic factors (including entrance exam scores, high school GPA, and college GPA). The internal-consistency reliability of each scale was assessed using Cronbach’s alpha.
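For context, Cronbach’s alpha estimates how consistently the items of a scale measure the same construct (values near 1 indicate high internal consistency; roughly 0.7 is often treated as acceptable). A minimal sketch in plain Python, using invented item responses rather than the authors’ data:

```python
# A minimal Cronbach's alpha computation. The item responses are invented
# for illustration; they are NOT data from Bordes-Edgar et al.

def cronbach_alpha(items):
    """items: one list per scale item, each the same length (one entry per respondent)."""
    k = len(items)

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

    item_var_sum = sum(variance(col) for col in items)
    totals = [sum(row) for row in zip(*items)]  # each respondent's total score
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

# Five respondents answering a hypothetical 3-item "value of education" scale (1-5)
item1 = [4, 5, 3, 4, 2]
item2 = [4, 4, 3, 5, 2]
item3 = [5, 5, 2, 4, 1]

alpha = cronbach_alpha([item1, item2, item3])
print(f"alpha = {alpha:.2f}")
```

Here the items rise and fall together across respondents, so alpha comes out high; if the items were unrelated, the item variances would swamp the total-score variance and alpha would drop toward zero.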


Results of the study indicate that students who persisted from the initial survey to the second part of the survey (4.5 years later) received more mentoring, made more positive persistence decisions during the initial phase of the survey (i.e., valued education, had positive self-belief in their own ability, received positive social support), and had a higher high school GPA.

Social support from friends was initially a strong predictor of persistence among the freshmen. As it turned out, the importance of social support from friends diminished over time, and students had to rely on other forms of social support (i.e., mentoring) to be successful.

It was noted in the article that students who are isolated from friends are more likely to drop out of college. The strongest correlation for social support as it relates to student success and persistence came from mentoring: students who graduated perceived that mentoring was a critical reason for their success.

It should be noted that the authors examined the racial/ethnic background of the mentors to see whether having a Latina/o mentor made a difference in persistence; no differences were found.

Limitations / Recommendations:

As it pertains to my own action research, the most notable limitation in this study is that it focused on Latina/o students. For my own research, I wish to expand the focus to include first-generation students, low-income students, and students with disabilities. Other limitations include a small sample size and the fact that most participants were of Mexican origin, so generalizability to other Latina/o groups may be limited. In addition, there were significantly more women than men in the study. I am not sure whether this would have a direct effect on the outcome, but it is worth noting and examining in future action research.

Application to my own Action Research / Discussion:

The most telling result of the study is that the initial connection we have with students is the most important factor in helping them to be successful. The initial self-beliefs of students are a strong indicator of future success. Therefore, mentors who are very intentional in how they establish relationships with students during the freshman year will be critical to those students’ future success – for example, mentors who deliberately work to develop and build student self-efficacy.

As it relates to social support, we might assume that social support from friends will increase the likelihood of student success. Results actually show that the importance of friend support diminishes over time. Establishing that social relationship with a mentor during the initial phase of a student’s college career will be important.

It was noted in the article that the development of partnerships between high schools and colleges is critical, as student success in high school is a strong predictor of success in college. As it pertains to mentoring, high school staff could work with college staff to arrange mentoring assignments before students start college. As noted in the article, “If students have a mentor at the beginning of their college career, they are more likely to succeed” (p. 365). Future action research could explore how ASU might partner with high schools to better prepare students for the transition, help those working with high school students understand the importance of a strong high school GPA (as a predictor of college success), and establish mentoring relationships early in students’ admission to the university.

It has been shown that mentoring has a positive impact on whether college students are successful. At Arizona State University, I have seen the impact that mentoring has had on student success. While the survey focused on Latina/o students, I believe the mentoring component could be applied to any at-risk student group. The student groups I wish to include in my own action research are first-generation students, low-income students, and students with disabilities.


Bordes-Edgar, V., Arredondo, P., Kurpius, S.R., & Rund, J. (2011). A Longitudinal Analysis of Latina/o Students’ Academic Persistence. Journal of Hispanic Higher Education, 10(4), 358-368.

Bordes, V., & Arredondo, P. (2005). Mentoring and 1st-year Latina/o college students. Journal of Hispanic Higher Education, 4, 114-133.

To Compete or Collaborate – Status in group

Spataro, S. E., Pettit, N. C., Sauer, S. J., & Lount, R. B. (2014). Interactions Among Same-Status Peers: Effects of Behavioral Style and Status Level. Small Group Research, 45(3), 314–336. doi:10.1177/1046496414532532



In looking at group development and collaboration, it is important to understand how titles, status, and the implied hierarchy of organizations affect the desired outcomes set forth by the group.  Spataro, Pettit, Sauer, and Lount (2014) set out to determine the effects of status on group dynamics and on willingness to collaborate within and across status groups.

In the study, Spataro et al. took a group of 61 university students and had them complete a general knowledge quiz and a NASA survival quiz.  Once finished, participants were assigned a status of “high,” “middle,” or “low.”  These statuses were not based on the results of their quizzes, but rather were imposed as a control to make sure that at least one member of each status was represented in each small group.  Participants were then given “feedback” on their quizzes from other members; the feedback was manipulated so that it was seen as either competitive or collaborative in nature, and as coming from a particular “status.”  In this experiment, the competitive feedback was more direct and accusatory about why answers were chosen, while the collaborative feedback was more exploratory, asking how answers were chosen and indicating that a discussion of rationale would be sought.

After the feedback was given, participants were asked how likely they would be to work with each member based on the feedback received. The researchers found a higher level of willingness to work together in a collaborative manner among the “high” status members.  The “low” and “middle” status members were also willing to work in a collaborative manner at higher rates than in a competitive one.  However, those in “low” status were just as likely to work in a competitive group.

Overall, Spataro et al. state that perceptions of status, and the implied hierarchy that comes with titles, affect members’ willingness to collaborate rather than compete: those of lower status have a desire to stand out as individuals to gain higher status, whereas those of higher status find greater benefit in learning from others and sharing with those in similar positions.  They did note that lower- and middle-status members will work collaboratively with higher-status members.

Personal Application:

Looking at the results and background of this study, I see this play out in my own work: competition set forth by the structure of title and rank, as it relates to advancement of position, rather than by the knowledge base and skills brought forward.  When thinking about the interventions I would like to implement, I have to wonder whether, with a group representing multiple departments and individuals of varying “status” at the same table, there will be a natural divide in the group when goals and initiatives are set forth.

We should all be working toward the same goal of student retention, regardless of college affiliation, as that is the common charge of our departments.  Still, I have concerns that for some members, the connection to individual departmental goals will supersede the desire to work as a collective. Furthermore, I am worried that the status of individuals in the group will overshadow or deter some from working in a collaborative group effort to assist all students.

Limitations in my practice:

I see limitations in who the participants will be, and I also have to consider that this study was done with a group of relative strangers.  The group that I will be working with has preexisting working relationships in other areas, or personal relationships outside of their titled positions.

As stated in the report, the study does not take into consideration “control over resources” (Spataro et al., 2014, p. 328), and I know some of the more established colleges have more financial and personnel resources.  I also need to get an understanding of what personal characteristics someone might look for in a collaborator.

Final thoughts:

I found this article helpful in framing why groups working together are or are not successful – or why they are not as successful as they could be – due to individuals’ perceptions of status in the group, where they stand, and what can be gained from standing out or working together.   If, in working with groups, I can identify when someone is being competitive and intentionally holding back, or stunting somebody else’s opportunity to contribute, perhaps I can help that individual see the benefits of a more collaborative approach: what can be learned from the group, and the ease with which outcomes can be achieved.


Spataro, S. E., Pettit, N. C., Sauer, S. J., & Lount, R. B. (2014). Interactions Among Same-Status Peers: Effects of Behavioral Style and Status Level. Small Group Research, 45(3), 314–336. doi:10.1177/1046496414532532

Authentically Motivating Students

Willems, P. P., & Gonzalez-DeHass, A. R. (2012). School-Community Partnerships: Using Authentic Contexts to Academically Motivate Students, 22(2), 9–30.

School-Community Partnerships: Using Authentic Contexts to Academically Motivate Students by Patricia Willems and Alyssa Gonzalez-DeHass discusses ways in which schools and communities can work together to form partnerships where students learn by engaging in relevant learning activities.  Their ideas are supported with research on three different models – authentic instruction, problem-based learning, and service learning – each of which provides students with an environment of real, interactive, real-life examples for learning and allows them the opportunity to participate in learning made possible by community partnerships.

The research is very well organized and is separated by headings and subheadings. The writing and layout are plain and easy enough to follow that the article could function as a how-to guide on building school-community partnerships.


The content of the study by Willems and Gonzalez-DeHass is a windfall of information that contributes to the field of research I am interested in.  This is exactly the type of research I imagined prior to starting classes. As it relates to what was discussed in class, this is the narrowing of resources in the midst of the breadth of material we’ve covered so far.  There are several factors I have yet to explore. For example, “which methodologies have the best chance of addressing teachers’ needs to meet significant curricular objectives amidst pressure for accountability and time demands associated with statewide standardized testing?” (Willems & Gonzalez-DeHass, 2012, p. 25). There is also “limited time to meet, identify, and contact community members” (Hands, 2005, p. 71).  There is still more to uncover, and that will require an even more narrow focus, but this is an excellent start on my journey to gathering resources and gaining perspective.


The theoretical framework – “student academic success is best achieved through the cooperation and support of schools, families, and communities” (Willems & Gonzalez-DeHass, 2012, p. 9) – aligns perfectly with my topic of inquiry.  I want to bring the school where I work and its local community together for the benefit of student motivation.  I’d like for the community to witness results like those of the problem-based learners who had “higher levels of intrinsic goal orientation, task value, use of elaborative learning strategies, critical thinking, and metacognitive self-regulation in comparison to students instructed in a more traditional teacher/textbook-centered fashion” (Willems & Gonzalez-DeHass, 2012, p. 21). I believe this research is timely, which will carry weight in attempting to create change.  Its relevance is demonstrated by the service learning model, which is “increasing in popularity, with some estimates showing that approximately 30% of all public schools and 50% of high schools include service learning as part of their curriculum” (Willems & Gonzalez-DeHass, 2012, p. 23).


The study by Willems and Gonzalez-DeHass did not discuss their own collection of data.  Instead, it was informed by a collection of research from a variety of sources. To begin, the discussion connected and fit together references to “the literature that address the social contexts of learning, including that of situated learning, social constructivism, and learner-centered education” (Willems & Gonzalez-DeHass, 2012, p. 10).  Next, Willems and Gonzalez-DeHass provided a section titled “Suggestions for School-Community Partnerships” (Willems & Gonzalez-DeHass, 2012, p. 13), which revealed some of the pitfalls and necessities of establishing partnerships.  Finally, three different models – “authentic instruction, problem-based learning and service learning” (Willems & Gonzalez-DeHass, 2012, p. 9) – were discussed, which inspired me to make notes and begin planning some details for my own innovation that I may otherwise have overlooked.  I now have a model of how I will communicate my intentions to the stakeholders, which will be invaluable for coordinating their efforts.


Some of the most compelling examples are the ones that teach students through doing instead of observing through the words in a textbook.  The advantage is that students have the school to support their access to areas of the community that would otherwise be unavailable. As with the students in the youth participatory action research – “by learning what resources and opportunities other schools offered, the school visits gave them a context to understand their own schooling experience” (Bautista, Bertrand, Morrell, Scorza, & Matthews, 2013, p. 9) – it will be my goal to provide similar experiences at my workplace for similar results. In view of equitable education, no textbooks are required; students will have real-world tools at their disposal. Lastly, the impact a program like this will have on the students is beyond measure. There is a benefit in every aspect of the program. First, students will be more intrinsically motivated from their experience and exposure to achievable possibilities, and “By including students in identifying genuine needs in the community, they are more likely to see their involvement as making a significant difference even as they further their own academic learning” (Willems & Gonzalez-DeHass, 2012, p. 24).   Second, students will get to experience a variety of opportunities to help them define their interests. More importantly, “infusing these opportunities for contextualized learning into academic activities will help students begin to see the meaningfulness of academic subject matter and its relevance beyond the classroom setting” (Willems & Gonzalez-DeHass, 2012, p. 25).


The study by Willems and Gonzalez-DeHass did not conclude with any results to reflect on.  They make a very reasonable case for moving toward building partnerships. The final comments are proposals for future work “that will require ongoing discussion and reflection in the educational community” (Willems & Gonzalez-DeHass, 2012, p. 25).  One new consideration for me will be how to present this innovation at the district office level.  While brainstorming ideas, I suspect that their primary concerns will be the expenses involved and the implications for testing and student achievement.




Bautista, M. A., Bertrand, M., Morrell, E., Scorza, D., & Matthews, C. (2013). Participatory action research and city youth: Methodological insights from the Council of Youth Research. UCLA, 115(100303), 1–23.

Hands, C. (2005). It’s Who You Know and What You Know: The Process of Creating Partnerships Between Schools and Communities, 63–84.

A Brain for Business

Rock, D., & Schwartz, J. (2007). The Neuroscience of Leadership, 10–17.


Concentrating on three main parts of the brain:  the prefrontal cortex, the basal ganglia, and the amygdala, Rock and Schwartz (2007) explain some of the functioning of the brain and implications for leaders of change.  The prefrontal cortex is where working memory resides and where new information is processed.  The basal ganglia is used in routine activities that we know how to do well without a lot of conscious thought and links simple behaviors from different parts of the brain.  If someone wants to change a routine or process that is already well-known, he has to concentrate on that change until it is embedded in his memory.  This means he has to override what is stored in his basal ganglia, a low-energy part of the brain, by using his prefrontal cortex, a high-energy part of the brain.  This expenditure of energy and effort causes an actual physical sensation of discomfort.

The brain also has the ability to detect “errors” or patterns that are different from what is expected.  The unexpected input causes the brain to give “strong signals that use a lot of energy, showing up in imaging technology as dramatic bursts of light” (Rock & Schwartz, 2007, p. 12).  These bursts of energy alert the amygdala, which is where the primal emotions of fear and anger are located.  They also take energy away from the prefrontal cortex.  So, when something is different from what we expect, we reduce the energy used for logical thought and increase the energy used for fear or anger.

The implications are that change is truly physically painful as high-energy areas of the brain are at work. Change arouses the basic feelings of anger and fear, which can inhibit learning. Focusing on new behaviors is powerful and can actually change the pathways in the brain, especially if the focus is repeated until the change has moved from the prefrontal cortex (working memory) to the basal ganglia (long-term, automatic memory).

Strengths and critiques

This is a well-organized article that takes a dense subject and makes it understandable.  With the projected audience of business leaders, the authors probably assumed that the readers do not know much, if anything, about brain functioning and research.  The practical, clear advice delineated in each section is supported with brief technical explanations of the workings of the brain.  Everyday examples help exemplify the technical points.

This is a unique contribution to the field of business management because few authors combine the fields of management and neuroscience.  The authors, Dr. Jeffrey Schwartz and Dr. David Rock, come from different fields, yet their collaboration is essential for applying the findings of brain research to other domains.  Dr. Schwartz is a psychiatrist who has done research on brain functioning, concentrating mostly on brain imaging and obsessive-compulsive disorder, and Dr. Rock is an international business professor who has written articles and speaks widely about the implications of brain research in business.

The main drawback of this article is that the studies that contributed to this body of knowledge are identified neither with in-text citations nor with an end-of-paper bibliography.  Schwartz and Rock are respected in their fields and are reliable sources of information, and the researchers behind various studies are noted in the text, but direct references to particular journals are not included.  Perhaps Schwartz and Rock did this by design so that the article would be more accessible to the general public and to business professionals who are not academics.


I think this article informs some of the work of Dr. Michelle Jordan.  In our class discussion with Dr. Jordan (2014), she pointed out that uncertainty can be productive and is essential for learning.  However, in a study of uncertainty in a robotics task in fifth-grade collaborative groups, one student, Roy, had difficulties with interpersonal skills as part of his group seemed to ignore or chasten him.  Jordan and McDaniel (2010) comment, “So salient was Roy’s need to resolve his relational uncertainty that he seemed unable to attend to the robotics task” (p. 24).  Roy’s brain was reacting to an “error,” where what was happening in his group (they were ignoring him) and what he expected were different.  His brain was using a large amount of energy to relay error messages and to activate the amygdala, the home of primal fear and anger.  This use of energy reduced the ability of the prefrontal cortex to process the higher-order thinking necessary to attend to the engineering problem.

Further study

This article focuses on the implications for business management and change.  However, the field of education can also benefit from this type of study.  It was a good place to see how neuroscience can inform practices in another field.  Many of the ideas are directly applicable to an educational setting.

Also, I would like to go back and delve into some of the actual scientific studies mentioned.  The field of neuroscience is still new.  Much of the research is recent, and more implications may be found as the research advances.


Jordan, M. E., & McDaniel, R. R. (2010). Managing Uncertainty During Collaborative Problem Solving in Elementary School Teams: The Role of Peer Influence in Robotics Engineering Activity, 00(2002).

Rock, D., & Schwartz, J. (2007). The Neuroscience of Leadership, 10–17.

Do Educators Utilize PD?

Doherty, I. (2011). Evaluating the impact of professional development on teaching practice: research findings and future research directions. US-China Education Review, A, 703-714. Retrieved June 12, 2014, from


Professional development (PD) centered on meaningful learning activities is generally considered to be highly effective. In his article, Iain Doherty (2011) sought to address whether there is a correlation between a satisfactory participant experience following a professional development session and educators actually implementing changes and utilizing the skills and strategies learned in their teaching practice. Previous research suggested that meaningful professional development focuses on several principles: 1) contextual realism that intimately connects with teaching practice, meaning PD modules are linked to challenges and practical teaching situations; 2) content that allows learners to connect new information with preexisting schematic frameworks; 3) the utilization of authentic activities that mimic how new information can and will be used in future activities; 4) offering multiple and diverse perspectives; and 5) collaborative reflection that promotes articulation of new knowledge (Doherty, 2011).

Doherty (2011) utilized PD modules that were built with the aforementioned considerations firmly in mind. He gave university-level educators in New Zealand information regarding the implementation of various technological tools that could enhance the learning of their students, including blogs, social networking sites, and wikis, among others (Doherty, 2011). Further, the PD sessions were not sage-on-stage/sit-and-get style sessions; each educator in attendance had the opportunity to create accounts and begin to use the tools during the session. To measure his results, Doherty (2011) gave the educators a “pretest” that asked them to assess their own familiarity with the various web-based tools they were about to learn about. Following the session, participants were again asked to self-assess their knowledge, awareness, familiarity, and ability to implement the tools in instruction. Doherty (2011) found that participants were significantly more knowledgeable about the resources following the training than they were before it.
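The pre/post self-assessment Doherty describes boils down to a paired comparison: each participant rates themselves before and after, and the analysis looks at per-person gains. A toy computation, using invented Likert-style ratings rather than Doherty’s data:

```python
# A sketch of a pre/post self-assessment comparison. The 1-5 familiarity
# ratings below are invented for illustration; they are NOT Doherty's data.

pre  = [1, 2, 1, 3, 2, 1, 2]   # self-rated familiarity before the session
post = [4, 4, 3, 5, 4, 3, 4]   # the same participants' ratings afterward

# Pairing each participant's ratings isolates individual change from
# differences between participants.
gains = [after - before for before, after in zip(pre, post)]
mean_gain = sum(gains) / len(gains)
print(f"mean self-reported gain: {mean_gain:.2f} points")
```

The key design choice is that the comparison is within-person; as Doherty’s follow-up showed, though, a large self-reported gain immediately after a session says nothing about whether the tools are actually used months later.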

To truly assess effectiveness, Doherty returned to the educators three months after the PD modules had been completed and gave a survey designed to assess whether the participants had, in any fashion, begun to implement or use the knowledge gained in the PD sessions in their instruction.  Doherty (2011) found that the vast majority (91-96%, depending on the technology) of participants had not utilized any of the technology showcased, despite very strong reviews immediately following the sessions. Doherty then sought to supplement his quantitative results with qualitative information, ascertained through interviews with willing participants. Doherty’s (2011) sample had diminished from an initial 27 to only seven who agreed to the interview; only one of the seven had made use of one of the multiple technology tools in their instruction, and the others were unable to articulate why they had not begun to implement the strategies they had learned.

Upon reflecting on Doherty’s (2011) methods and results, I see both connections to my own work and areas of strength and weakness, each of which I want to address in turn. There are a number of connections between this research and my own community of practice. One of the things that we emphasize in my role is the follow-up to ensure that educators 1) feel supported as they begin to utilize the methods discussed during the actual PD session and 2) actually implement the strategies and tools in their professional work. I think that had Doherty offered ongoing implementation support to the educators, he may have seen significantly higher rates of tool utilization. I know that when I have been a participant in professional development sessions, I’ve left feeling very motivated by all that I am able to do with the new tools and strategies, but if I don’t begin to utilize them almost immediately, I begin to lose understanding of their capabilities and how to integrate them into instruction.

One of the strengths of Doherty’s methods was the manner in which he assessed his participants’ knowledge before and after the session, and then gauged their implementation by following up with attendees three months after the modules had been completed. This gives a good understanding of 1) the comfort and familiarity levels with which attendees entered the session, 2) the effectiveness of the facilitator(s) in communicating the desired knowledge to the participants, and 3) how valuable the content was to the educators, as reflected in the rates at which they actually utilized the information conveyed. This approach was a strong one, as it assessed the participants at various, predetermined intervals, providing information that a short-term data collection period wouldn’t even come close to measuring.

Another particular strength of Doherty’s procedure is the application of interviews and qualitative methods to supplement the quantitative information. Doherty (2011) chose to interview participants to pinpoint why and how participants chose to, or in his case chose not to, utilize the information conveyed through the professional development sessions. Though they couldn’t self-identify the root causes of their inaction, the process of interviewing participants, in addition to a simple post-assessment, offers invaluable insight that might not otherwise be communicated to the researcher. This research model definitely provides a framework that I can utilize as I begin to plan my own innovation in the area of professional development, combining both short- and long-term quantitative data, as well as qualitative data, to provide further information.

The last area I want to address regarding Doherty’s (2011) methods is a lens he lacked: one through which he ought to have collected and analyzed data to gain an even more meaningful perspective on the role of professional development in education. In the introductory paragraph, he writes, “[professional development] is important to improve and enhance student learning” (Doherty, 2011, p. 703). If educators are tasked with improving outcomes for students, and professional development is meant to play a role in that charge, then improvement in student performance, whether academic, social, behavioral, or otherwise, should be an essential consideration when measuring or assessing the effectiveness of any session, content, or implementation. Given that Doherty (2011) himself names that purpose of professional development, the change in student outcomes would have been a valuable lens through which to collect and analyze data.


Screen Test: E-Reading Comprehension

Margolin, S. J., Driscoll, C., Toland, M. J., & Kegler, J. L. (2013). E-readers, computer screens, or paper: Does reading comprehension change across media platforms? Applied Cognitive Psychology, 27(4), 512–519. doi:10.1002/acp.2930

I never thought it would happen to me. I am, and always have been, a bibliophile. I love books for the stories within their covers, sure, but I also love books as books. I love spines and endpapers and embossing and cover art and epigraphs and acknowledgements and author photos and notes on the type and those ruffly cut edges on fine hardcover editions. I still have the copies of Jane Eyre and Pride and Prejudice that my older sister, the best reader I know, bought for me for my 10th birthday. Touching their faded covers, I can still summon how those books felt like the kid sister’s longed-for invitation to hang out with the big kids. One of my fondest sensory experiences is the squeak of the plastic library cover on a book giving way to yield the sweet, pulpy smell of paper.

I resisted the e-reader, I did. When people argued its merits, telling me I could bring 500 books on vacation with me, I turned up my nose. No, I scoffed, I love the ritual of selecting which book gets the privilege of hopping from bookshelf to suitcase; I love when I finish a book on vacation and so, stranded in Glenwood, Minnesota, with nothing to read, I go into that used bookstore and buy whatever they have that looks pretty good. This is how I ended up reading Geek Love by Katherine Dunn. If I’d brought 500 preselected books with me, that book would never have found me and thank God it did. And then it happened:

Reader, I bought an e-reader.

Truth be told, I’m on my second e-reader. My first was a Kindle, and a few years after I bought it, I upgraded to a Kindle Paperwhite. I don’t really know why I succumbed in the first place. When Amazon announced the Kindle in 2007, I got a good head of steam up about it and swore I’d never give in. I have thousands of books. And I have a hard time parting with them. Before and after I’ve read them, they remain, stacked and spilling from shelves and boxes, in every room of my house. But my affection for gadgetry must have won out, because in 2009 I told my husband that, gee, well, maybe, for my birthday, I guess, perhaps, I’d like to have a … Kindle? He was all too happy to oblige, because he was the one who’d moved all those thousands of books, box by heavy box, when we bought our first home together.

I was surprised to find that I loved the Kindle. I didn’t even need to warm up to it. It helps that the designers have paid attention to the tactile and sensual aspects of reading, so the thing feels like a book in my hands and I can “turn” the page either with a tap on the screen or with a swipe like that of a licked fingertip. I will say this: My Kindle is only for reading. Deep reading. Immersive reading. Longform reading. Book reading. I do not read magazines or periodicals on it (I read those on my iPad!). It does not have browser capability (again, iPad). Although it does have some interactive functions–for instance, I can hold my finger down on a word and be taken to a built-in dictionary for a definition–I don’t make frequent use of them. It also has functions to allow me to annotate, “highlight” or “clip” portions of text, but I use those infrequently, as well. Truth be told, I am not a heavy annotator in my pleasure reading, and I never have been. Much in the same way that the dog breeds I like best are the ones that are most like cats, I chose the e-reader that was as much like an analog reader–ahem, book–as possible. I deliberately selected the Kindle for the fact that it’s not backlit (like the iPad) and therefore doesn’t result in eye strain, headaches, or fatigue (for me). I also made a conscious choice not to get an e-reader with browser capability or the ability to watch movies (the Kindle Fire, for example).

There are, of course, pluses and minuses to the Kindle. I’m one of those people who can remember that a passage or image was on, say, the bottom half of a left-hand page about a third of the way through the book. With the Kindle, all the “pages” are oriented identically. It’s harder for me to find that passage or image that I remember. On the other hand, I can use the search function to locate a remembered word or phrase from the passage. (Another bonus: When I want to analyze recurring imagery on my own or with my students, I can, for example, search for all instances of relevant words: boat imagery, colors, whatever.) I believe I read more with my Kindle, because if I finish a book at the doctor’s office, I can immediately start another. In fact, I can immediately buy another. Downside: On the Kindle, I can’t tell how long a book is. I can’t tell when the portion in my right hand starts to weigh less than the portion in my left hand so I know to start rationing it more slowly to make it last. On the Kindle, books just up and end on me. And then I’m confronted with a rude-seeming invitation to rate the book I read (how crass!) or buy another book. Can we just cuddle for a few minutes? Geeze. Of course, sometimes it’s a good thing that I can’t tell how long a book is. With the Kindle, I don’t reject titles because they’re 700 pages long and I’m not sure I’ll have the time. I just dig in and keep plugging away. When I secretly wanted to read a trashy Jodi Picoult book but I didn’t want anyone to know, my Kindle kept my secret. On the other hand, thanks to the popularity of Kindles, Nooks, and other e-readers, I can no longer survey a waiting room or airplane to find out what book is the Cold Mountain of this year. I once met and fell in love with a man on a plane because he was carrying a copy of Raymond Carver’s Cathedral. Somehow, I doubt that “So. How about the battery life on that Kindle, eh?” would have, well, kindled the brief but beautiful romance that followed.

As a reader and a teacher, I have often wondered if the type and quality of reading I’m doing on my Kindle is comparable to the type and quality of reading I always did (and sometimes still do) on paper. This question has increased relevance to me now, as my school has recently decided to “move away from printed copies of textbooks and towards a greater adoption of digital resources and eBooks” (J. Boehle, personal communication, February 28, 2014). To that end, “all students in grades 7-12 will be required to Bring Your Own Device (BYOD) to school starting in the fall of 2015” (J. Boehle, personal communication, May 29, 2014). We teachers received our school-issued iPads last week, bringing my household total of Apple devices (iPads, laptops, and iPods) to a slightly embarrassing 9. That doesn’t include the treasured Kindle or my husband’s PC. Incidentally, we are a household of two people and three cats. We have more computers than we do sentient beings.

Reading “is a process that, once learned, allows an individual to mentally represent written text” (Margolin, Driscoll, Toland, & Kegler, 2013, p. 512). As simple as that sounds, reading is a cognitively complex act, and there are many theories about what exactly is going on when I curl up with an afghan, chew on the end of my ponytail, and get lost in Anna Karenina for the sixth time. Hoover and Gough (2000), operating on the Southwest Educational Development Laboratory’s (SEDL) framework of the cognitive foundations of reading, explain that “reading comprehension (or, simply, reading) … is based upon two equally important competencies. One is language comprehension–the ability to construct meaning from spoken representations of language; the second is decoding—the ability to recognize written representations of words” (p. 13). Furthermore, each of these two abilities depends on “a collection of interrelated cognitive elements that must be well developed to be successful at either comprehending language or decoding” (Wren, 2000, p. 20). Wren (2000) explains that, according to the SEDL framework, the elements that support language comprehension and decoding toward reading comprehension are as follows (p. 18):

  • Background knowledge
  • Linguistic knowledge
  • Phonology
  • Syntax
  • Semantics
  • Cipher knowledge
  • Lexical knowledge
  • Phoneme awareness
  • Knowledge of the alphabetic principle
  • Letter knowledge
  • Concepts about print

Each of those elements could form the basis of a nuanced and lengthy exploration; for the purposes of this discussion, they serve only to underscore the cognitive complexity of reading. That is, we will take as a given that reading comprehension relies on an intricate set of brain tasks. The question before us is whether reading on a screen–specifically on an e-reader like a Kindle–as opposed to reading on good, old-fashioned paper, allows us to achieve that end goal: reading comprehension. That’s what Margolin, Driscoll, Toland, and Kegler (2013) examined.

So let’s take a look at their research protocol before we explore their findings (yes, I’m going to make you wait to find out whether you’re reading this as well on screen as you would if you were reading it on paper! It’s like another old favorite, The Monster at the End of This Book) and what this means for my school, and my students, and me. The clearly written, error-free article is organized effectively, prefacing the study itself with a thorough review of the existing literature (see below) as well as the real-world applicability of the findings. The authors also firmly locate their study within a theoretical framework of reading, specifically the Construction Integration (CI) model (Margolin et al., 2013, p. 512).

Literature Review: The Holes in the Research
Memory, not Comprehension
Margolin et al. (2013) first established the research landscape on the subject of e-reading and observed a few trends that seemed to create a niche for their query. They observed that the previous research into electronic reading “has examined memory for text … [but] the reading literature has not yet examined comprehension” (Margolin et al., 2013, p. 512). That is, researchers have examined readers’ ability to recall what they read electronically, but not how well they understood it.

Process, not Product
Secondly, Margolin et al. (2013) established that the earliest research into electronic reading “focused primarily on the process and efficacy of reading from computers, rather than outcomes like comprehension and learning” (p. 513). For example, researchers studied the speed with which individuals could read and proofread on paper versus a computer (Margolin et al., 2013, p. 513) and analyzed discrepancies in terms of the experiential and physical differences between reading on screen and reading on paper (backlighting, typographic spacing and fonts, scrolling vs. page-turning, etc.) (Margolin et al., 2013, p. 513).

E-Readers, not Computers
Thirdly, Margolin et al. (2013) explain that previous research has focused on computer-screen reading with hyperlinks (blogs, online news websites, etc.) as opposed to e-reader reading, which more closely mimics book reading and doesn’t present opportunities for readers to click out of the text (p. 513).

Therefore, to fill the hole in the research presented by these established trends, Margolin et al. (2013) “looked to explore a new technology known as an e-reader, whose intended function is the singular process of reading, rather than searching for and evaluating information online” (p. 514). The authors argue that e-reader reading is fundamentally different from computer-screen reading because “there is no need to search or problem-solve to navigate through the hyperlinks, because these are not present on an e-reader device” like my Kindle (Margolin et al., 2013, p. 514). The authors make a very strong case for their study, demonstrating that the research to date has not really explored comprehension with e-readers. Their study is timely, given that e-readers are gaining in popularity among many age groups and contexts. In this manner, the authors make a compelling case for the relevance of this study to the fields of cognition, educational psychology, and general reading.

Many schools, including my own, are implementing tablet or e-textbook programs. However, the authors fail to address the fact that schools are, on the whole, implementing tablet programs, not e-reader programs. The value of this study is somewhat limited by the fact that e-readers are not the electronic device leading the way in the sea change that’s happening at schools. It’s the tablet–full of hyperlinks, browsers, and doodads–that is invading the classroom. So, even though the researchers used academic texts like the ones students study in school, this study may not ultimately be as germane to the burgeoning conversation surrounding how students read required texts in classrooms.

Data Collection Method and Research Design
Margolin et al. (2013) recruited 90 research participants ages 18 to 25 from an introduction to psychology course at a Western New York college (p. 514). Slightly less than a third of the participants were male, but the ratio of male to female participants was retained in the three groups to which the participants were randomly assigned: Paper readers, computer readers, and e-readers (Margolin et al., 2013, p. 514). It is perhaps worth noting that none of the participants had any diagnosed learning disabilities or dyslexia (Margolin et al., 2013, p. 514). (I point that out simply because I’m particularly interested in how our switch to e-textbooks might affect–negatively or positively–our students with those issues. More on that a bit later.) Each of the groups was presented with 10 passages to read: five were expository, intended to “convey facts and information” (Margolin et al., 2013, p. 514), and five were narrative, intended to “tell a story or chronicle an event” (Margolin et al., 2013, p. 514). The researchers ensured that there was uniformity among the texts in terms of length and reading level, as measured by the Flesch-Kincaid grade-level scale (Margolin et al., 2013, p. 515). The paper readers read on standard 8.5″ x 11″ white paper, the computer readers read PDFs on screen, and the e-readers read on a Kindle with e-ink technology just like the one I have (Margolin et al., 2013, p. 515). Immediately following their reading of the texts, participants completed a questionnaire in which they answered analytical questions to probe their level of understanding and interpretation, as well as questions about their process and experience (did they skip around, did they re-read, did they follow along with a finger, etc.) (Margolin et al., 2013, p. 515).
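As an aside for the data-curious: the Flesch-Kincaid grade level the researchers used to equate the passages is just a weighted formula over average sentence length and average syllables per word. Here is a minimal sketch in Python; the vowel-group syllable counter is a crude heuristic of my own (real implementations use pronunciation dictionaries), so treat the output as approximate.

```python
import re

def count_syllables(word):
    # Crude heuristic: count runs of consecutive vowels,
    # discounting a silent trailing 'e'.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text):
    # FK grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)
```

Short, monosyllabic sentences score near (or even below) grade zero, which is a reminder that passages equated on this scale can still differ in other ways, like topic familiarity.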

The organization of this study seems very thorough. While the researchers could have limited their exploration to correlation between media presentation (paper, computer, e-reader) and comprehension, they went a step further to examine the behaviors that readers employ to assist in their comprehension. This gives us a fuller picture of what reading looks like across all of these platforms and makes the conclusion more compelling.

Results and Analysis
The results portion of this article was less clear than some of the other components. More tables or graphs would have been helpful, especially in terms of depicting the relationships among media type, reading behavior, and comprehension. Nevertheless, a few takeaways were immediately apparent. For one, comprehension was found to be slightly lower for narrative passages than for expository passages overall, regardless of how the text was presented (Margolin et al., 2013, p. 516). The comprehension scores for all three presentation styles were comparable and reflected a similar (small) disparity between comprehension of narrative texts and expository texts (Margolin et al., 2013, p. 516). Comprehension was found to be higher for computer readers when they followed along with a finger or silently mouthed what they were reading; however, these behaviors did not improve comprehension for paper readers or e-readers (Margolin et al., 2013, p. 516). Examining each presentation of text on its own, reading behaviors for the most part did not make a difference in the reader’s comprehension of narrative versus expository text (Margolin et al., 2013, p. 516). The only two behaviors that seemed to correlate with significantly higher comprehension were finger-tracking and mouthing (Margolin et al., 2013, p. 516). There did not seem to be any difference in the frequency with which readers relied on most behaviors (highlighting, finger tracking, mouthing, taking notes, saying words aloud) across the three presentation styles, with one exception: When it came to skipping around while reading, the Kindle readers were found to do much less of this than either paper or computer readers (Margolin et al., 2013, p. 516).
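Since the article reports comparable mean comprehension scores across the three groups, it may help to see what a three-group comparison looks like computationally. The sketch below computes a one-way ANOVA F statistic from scratch; the choice of test and the toy score lists are my own assumptions for illustration, not data from the study.

```python
def one_way_anova_f(*groups):
    # F = mean square between groups / mean square within groups.
    # An F near (or below) 1 means the group means differ no more
    # than chance variation would predict.
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    k, n = len(groups), len(all_scores)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2
                    for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical comprehension scores for paper, computer, and e-reader groups
paper, computer, ereader = [7, 8, 6, 7], [6, 8, 7, 7], [7, 7, 8, 6]
print(one_way_anova_f(paper, computer, ereader))  # 0.0: identical group means
```

With the authors’ actual data (which the article does not reproduce), a similarly small F relative to its critical value would correspond to their finding of no significant comprehension difference across platforms.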

The authors acknowledge that the relevance and applicability of their study may be limited somewhat by the relative youth of the study participants. Older readers, who may have issues with working memory, may find that they encounter more reading comprehension problems overall or that the platform of delivery of text does make a difference in their comprehension. The study would have to be replicated with an older demographic before we could whole-heartedly say that reading comprehension isn’t negatively affected by delivery medium. The college students who participated in this study are probably very comfortable with technology by virtue of their age; it could be that we are seeing that reading comprehension is not negatively affected by e-reading as long as the e-reader has met a threshold level of comfort with the technology. It would also be interesting to study young, budding readers across these delivery methods to find out if the way we learn to read is affected by the medium in which we read. This study included people who were proficient readers and proficient users of technology; it might be telling to examine groups on either side of that center swath: young students with less reading proficiency and older readers with less technological comfort and less ideal working memories.

The results of this study appear to be encouraging in a world where “the amount of digital text being created grows exponentially every day” (Margolin et al., 2013, p. 512) and where “reading, and more importantly, comprehension, is a fundamental skill necessary for the successful completion of almost any type of class as well as in the job marketplace” (Margolin et al., 2013, p. 512). Overall, the authors found no significant difference in reading comprehension among the paper readers, the computer readers, and the e-readers. Furthermore, while readers from all groups comprehended narrative texts slightly less well than they did expository texts, this difference was no greater for the computer readers or e-readers than for the paper readers, suggesting that narrative reading might simply be more difficult (for people of this age group and experience level) than expository reading. The data also suggest that people engage in largely similar behaviors when reading on various platforms, that only finger-tracking and mouthing had any effect on comprehension (across all presentation styles), and that even that effect was minimal. In conclusion, Margolin et al. (2013) argue that “electronic forms of text presentation (both computer and e-reader) may be just as viable a format as paper presentation for both narrative and expository texts” (p. 517).

This is encouraging news for me personally and professionally, and it seems to support what my own “anecdata” has suggested: For me, reading on my Kindle really doesn’t feel any different from reading a book. I still cry in public with alarming regularity. In my role as a teacher, I have always had a policy of allowing my students to read in whatever platform or media they prefer (except smartphones), as I philosophically don’t believe in impeding reading. Reading is so personal, so… sensual … that I cannot bring myself to say to my students, who have 10 or more years of experience as readers, “No, don’t do it that way, do it this way.”

This study makes me feel somewhat better about the decision my school has made to move all of students’ instructional materials and textbooks to a single device–that is, if we can control the potentially distracting functions of their tablets and, in essence, turn them into e-readers. The benefits of such a policy are many. For one, having every text on an e-reader would mean that the days of schlepping a backpack so laden with books that it causes serious pain and even injury are behind us. I believe we have a responsibility as educators to look out for the physical well-being of our students, and sending a 100-pound kid home with a 60-pound backpack doesn’t seem right to me. I’m also particularly attuned to the needs of our kids with physical disabilities, however rare they might be. I was a teenager with arthritis, and my school accommodated me by issuing me two sets of textbooks: One for my locker and one for home use. But that didn’t help on the days when even lugging my U.S. history text to class was painful. And imagine the flexibility for me and my students when it’s discovered that we have an extra 20 minutes in class that we didn’t plan for and we can shift gears to a new essay or poem without everyone running to lockers and back!

What Next?
E-readers vs. tablets? What about phones?
However, the study raises several questions worth further investigation: Are there any differences in reading comprehension or supportive reading behaviors when one compares Kindle reading to, say, iPad reading? I absolutely do not–could not–read a book on my iPad. I do read magazines and newspapers on my iPad, and while I think my taste in magazines isn’t exactly vapid (Smithsonian, The New Yorker, The Atlantic, etc.), this is grazing reading–I do it for shorter amounts of time and I allow myself to pursue tangents and offshoots at will. But for immersive, hours-long reading sessions, the backlighting on the iPad would wear out my eyes in an hour. Secondly, the iPad is loaded with bells and whistles that threaten to distract even the most devoted reader. The whole point of an iPad–and the reason my school is shifting to them–is that everything is right there in one place: Everything for good (books for every course all at one’s fingertips, no running to lockers or bemoaning books forgotten at home!) and everything for bad (E-mail! Text messaging! Facebook! Canvas! Photos! A camera! Shazam! The Internet!).

I’d like to see a study that explores e-readers as compared with these all-in-one tablets. Furthermore, I’ve had more than one student ask about reading our required texts on a smartphone. The constrained size of the screen, in addition to the aforementioned concerns about the iPad, compel me to reject that proposition out of hand. But what if research shows that reading on a 6.7-square-inch phone is just as good?

Students with learning disabilities or ADD?
Secondly, I’d really like to see an examination of how reading comprehension and reading behaviors are affected when the readers are using electronic devices and when the readers have attention deficit disorder, dyslexia, or other learning disabilities. As I’ve mentioned previously, we have a significant population of students at our school with these issues, and I’d hate to think we’re introducing a technology that makes school all the harder for them. As Margolin et al. (2013) point out, when a reader is distracted, his or her “capacity for processing text may be reduced, making difficulties with fully understanding the text more likely” (p. 512). How cruel to mandate that a kid with ADD arm himself with a weapon to sabotage his own performance.

What if kids want to go old-school?
I think it’s important to remember that this study suggests that there is no difference in reading comprehension among the three platforms. The obvious interpretation of that is that e-reading isn’t worse than traditional paper reading. But the inverse is suggested, too: when it comes to comprehension, e-reading also isn’t better than paper reading. Whatever the benefits of a tablet policy, they are likely associated with convenience and synchrony among students, as opposed to cognitive or pedagogical benefit (at least as far as reading is concerned). So what will I do in two years when a student says, “Is it OK if I read The Metamorphosis in this hardcover first edition that belonged to my grandmother instead of on my iPad?” My inclination will be to say the equivalent of what I say now to students who want to bring an e-reader instead of a hard copy of The Metamorphosis: “Sure, yes, whatever you like. Just make sure you have it every day when we need it in class.” But if my school is making this institutional change across the board, will I have the freedom and backing to say that? Or will I have to say, “No, please, I’d prefer you download it on the iPad”? That feels wrong to me. It feels wrong to interfere with a young adult’s established preference for reading medium given that there doesn’t appear to be a comprehension benefit.

That other “R”: Writing?
Of course, any exploration of how the cognitive experience of reading on screen differs (or doesn’t) from the experience of reading on paper invites me to think about reading’s dance partner, writing. How does student writing differ when it’s typed versus when it’s handwritten? I do know that research is ongoing in this area, and I’d like to explore it. Nowadays, a student can draft, revise, submit, and receive feedback on a “paper” without touching any actual paper.

Ultimately, though, this study seems to be a bracing splash of cold water for those educators, parents, and students who assume that reading on a device must be worse than reading in a book. This is a status quo bias and it’s not, in itself, a good reason to eschew new technologies that indubitably have benefits (convenience and allure, to name two). Some research into e-reading suggests that “reading online may be at the very least more complex than reading traditional printed text” (Margolin et al., 2013, p. 513) and that it involves “more than simply understanding what is encountered” (Margolin et al., 2013, p. 513), requiring “that the reader engage in other higher level processing of the material beyond creating a mental representation of the text” (Margolin et al., 2013, p. 513).

It’s so easy to make the mistake of thinking that the practices, habits, values, and institutions we have now are older and more established than they are. When it comes to reading, it’s perhaps useful to remember that “we were never born to read. Human beings invented reading only a few thousand years ago. And with this invention, we rearranged the very organization of our brain, which in turn expanded the ways we were able to think, which altered the intellectual evolution of our species” (Wolf, 2007, p. 3). We live in a time when innovation and invention take place at a mind-dizzying pace. It might behoove us to recall how we humans have adapted to our own inventions in the past. As Wolf (2007) argues, “reading is one of the single most remarkable inventions in history … Our ancestors’ invention could come about only because of the human brain’s extraordinary ability to make new connections among its existing structures, a process made possible by the brain’s ability to be shaped by experience” (p. 3).

As Winston Churchill said, “We shape our buildings. Thereafter, they shape us.”


Hoover, W.A., and Gough, P.B. (2000). The reading acquisition framework: an overview. In The cognitive foundations of learning to read: A framework. Retrieved from Southwest Educational Development Laboratory Web site:

Margolin, S. J., Driscoll, C., Toland, M. J., & Kegler, J. L. (2013). E-readers, computer screens, or paper: Does reading comprehension change across media platforms? Applied Cognitive Psychology, 27(4), 512–519. doi:10.1002/acp.2930

Wolf, M. (2007). Proust and the squid: The story and science of the reading brain. New York, NY: Harper.

Wren, S. (2000). The cognitive foundations of learning to read: A framework. Retrieved from Southwest Educational Development Laboratory Web site:


Effectiveness of CRT in Literacy Instruction – An Example of Bias in Research

Cheesman, E., & De Pry, R. (2010). A critical review of culturally responsive literacy instruction. Journal of Praxis in Multicultural Education, 5(1). doi:10.9741/2161-2978.1034

Article in Brief

The article in question seeks to determine the effectiveness of the Culturally Responsive Teaching model in literacy instruction. The authors attempt to do this through a wide-ranging literature review, while also recounting the recent history of education policy and the more common methods of literacy instruction.


The article begins by articulating the urgency with which we must consider intervention in literacy instruction in our schools. We learn of the serious risks associated with low levels of literacy development, at the societal level as well as the individual level. Once the authors have established the high need for exemplary literacy instruction, we are introduced to the recent history of education policy. In a section titled “School Reform Efforts,” we learn about the steps that have been taken and a few of the models adopted to close the “achievement gap.” Beginning with the hallmark reform effort best known as No Child Left Behind, we learn of the “valiant” attempts of the Bush administration to help minority and low socioeconomic-status students in schools. The authors then turn to two common models that have been adopted to try to close the gap within schools: tiered instruction and Culturally Responsive Teaching. The two methodologies are explained (CRT at greater length), and the authors move on to other challenges in closing the achievement gap.

A number of “Causes of Reading Failure” are named in the following section; from behavioral problems to reading disability, the authors make a real case for the challenges involved in promoting literacy learning. Next we are taken through the research-backed components of effective reading instruction, learning of five “established” research-based practices known to develop literacy in children.

Finally, we arrive at the section on the actual effectiveness of CRT in literacy instruction. After a few paragraphs, the authors turn to the implications for future research, and the article ends with a brief conclusion.

Contribution to the Field/Implication for Future Research

The authors’ results limit this article’s ability to contribute to the CRT movement; however, the article is certainly part of the educational research canon.  Literacy instruction is a hot issue in our country, and particularly in Arizona, where students are unable to move on to 4th grade if they have not passed a 3rd grade reading test.

The article stands in sharp contrast to my current efforts in education; however, it does elicit some powerful questions about the practicality of CRT as well as the difficulty of operationalizing the method.

Theoretical Framework

The article poses itself as a literature review of the effect of Culturally Responsive Teaching practices on literacy instruction.  The article certainly reviews A LOT of literature; however, not enough of it seems to pertain to the actual act of Culturally Responsive Teaching.  The actual framework seems to be based much more on a review of recent education policy and practices, as well as what works in literacy instruction, while the review of Culturally Responsive Teaching literature is lacking.


Data Collection

The data collection is really just an aggregation of a large number of articles and research findings in education.  The authors have reviewed over fifty articles to establish their research topic and findings in this literature review.


The analysis of the effect of Culturally Responsive Teaching seems to come through the lens of two separate research articles.  While the authors utilize over fifty articles and readings to explain the state of literacy and education, as well as to define what Culturally Responsive Teaching is, we find they analyze the results of very few articles.  In their analysis they find that while Culturally Responsive Teaching is intuitively appealing, it does not hold up when examined with a research-based approach.  The authors then point to a study that actually hypothesizes that an approach that is not culturally similar will promote interracial awareness.  The authors go on to attack the credibility of various Culturally Responsive research because of faulty terminology in some articles.  They conclude that recommending these practices without further evidence of effectiveness will serve to undermine the great promise of literacy instruction from scientifically based reading research.


The authors conclude by pointing to Dr. Walter J. Turnbull, a changemaker in education in Harlem, New York, as an outstanding example of a culturally responsive approach to education.  While lauding Dr. Turnbull on his success, the authors question the ability of the approach to be replicated across different contexts.  In the implications for future research, the authors articulate a number of questions about the viability of Culturally Responsive Teaching and the need for more evidence of effectiveness.  These questions include isolating factors within the Culturally Responsive approach, questioning what mindsets are required, and pondering how best to analyze the true effects in order to answer research questions.

In short, the authors prescribe a heavy dose of future research with strong operationalization and systemization of Culturally Responsive practices, so as to catalogue and organize them according to their effectiveness and their ability to be replicated.

Reader’s Thoughts

I wanted to make some quick comments about the article, as it provided strong evidence for comment in relation to the course themes of impact, access, and excellence.  From the beginning it seems clear that the authors are not interested in actually discussing the effects and possible benefits of the Culturally Responsive model.  In fact, the authors state what they consider the correct model about four pages in without ever mentioning the potential of CRT.

I make this note because this article speaks to how structures of power and influence can actually misrepresent theories that empower those who are marginalized.  The authors cite very few articles in relation to CRT and ask for it to be systemized, when it is exactly the point of CRT NOT to be systemized.  The CRT method is about reflection, adaptation, and evolution, not systems and operations replicated across communities.  I think that the authors suffer from some sort of industrial complex that does not allow a dynamic method like CRT to enter the mainstream.  I think it is important to note that an article like this can sway the minds of a good many people because it is academically published; however, it suffers from some real biases in its analyses and presentation!

Analysis of an Ethnocentric Charter School on an American Indian Reservation

Fenimore-Smith, J. K. (2009). The Power of Place: Creating an Indigenous Charter School. Journal of American Indian Education, 48(2), 1–17.

Charter schools have often been endorsed as an alternative to the public school system, as they allow for more freedom in curriculum and instruction while still adhering to state standards.  Specifically, ethnocentric charter schools have been employed to address the complex and unique needs and challenges surrounding the educational struggles of marginalized and colonized indigenous populations.  The objective of ethnocentric charter schools is to integrate traditional indigenous linguistic and cultural ways of knowing into the Western educational platform.

In the article “The Power of Place: Creating an Indigenous Charter School,” Kay Fenimore-Smith (2009) outlines her study of an ethnocentric indigenous charter school on an undisclosed reservation in the northwestern United States.  The article outlines a two-year study to identify and examine the challenges and successes of Eagle High School (a pseudonym) during the school’s first two years of operation.  The purpose of the study “was to provide a historical record which could serve as a basis for evaluation of the school as well as documentation and analysis of policies and practices of a fledgling Indigenous charter school” (p. 2).

Fenimore-Smith (2009) builds upon other related studies that explore the complexities of developing and implementing linguistically and culturally integrated curricula in American Indian schools.  The research and the school’s initial year of operation began in the summer of 2004.  Although Fenimore-Smith (2009) was unable to visit the school consistently during its first year, she did attend school board meetings and school functions, interviewed staff, and conducted multiple classroom observations.  During her sabbatical in the fall semester of 2005, she regularly volunteered at the school to maintain daily contact with the students and staff.  For the remainder of the school year, Fenimore-Smith (2009) occasionally met with school staff and attended in-service sessions in addition to other school functions.

The research was conducted through a variety of ethnographic methodologies, such as field notes on interviews, meetings, and observations.   Other strategies included taping daily journal entries and collecting school-related artifacts, such as student/parent and staff handbooks, classroom handouts, and school schedules.  Fenimore-Smith (2009) employed a triangulation model of reviewing the data, comparing the information garnered through the taped notes to the interviews and artifacts.  Additionally, the initial analysis of the data was reviewed by the school administrator and the student who transcribed the interviews.  As the data revealed several themes, Cummins’s (1992) theory of cultural differences was implemented as the framework and lens through which the findings were dissected and analyzed. The theory outlines four elements that affect minority students’ access to education: incorporation of students’ language and culture, community participation in the school, instruction, and testing.  As Cummins’s (1992) theoretical framework is based on educational access, Fenimore-Smith (2009) contends that the study’s findings are grounded in practical application.

The findings of the research are indicative of the systemic challenges faced by Eagle High School, whose mission statement reads: “[Eagle High School] is dedicated to recognizing an individual’s worth and dignity and mutual respect between all people. [It] will provide a new educational environment and unique curriculum to bridge educational, cultural, economic and social gaps” (p. 5).  Although Eagle High School consistently strove toward the goals outlined in the mission statement, it was ultimately largely unsuccessful due to unforeseen challenges. The findings highlighted that the school did not adequately train teachers to integrate Native language and culture, and the language and culture classes were not integrated into the curricula.  Furthermore, some students were resistant to participating in traditional cultural activities and language and, as the focus of the curricula shifted, the students requested structured, more Western activities.  Lastly, there was no community or parental involvement, and 79% of the students failed the state standardized assessment.  However, the study did reveal that students felt valued by the teachers and that, while the community and parents were not actively involved in the school, they expressed appreciation for the school and its mission.

While the findings are very compelling, there are methodological issues, and one claim, within this study that should be further addressed.  If there are no transparent or valid research methodologies, the study cannot be duplicated as a means of testing for reliability.  Unfortunately, this opaque approach muddles the validity of the findings, no matter how compelling they appear.

The first element is the fact that Fenimore-Smith (2009) admits to having had limited access to the school throughout the first year of her research.  However, she does not address how limited the access was, nor in what capacity.  She also does not discuss whether this affected her research methodologies and findings, or whether she compensated for the lack of access through another approach.

The second element is that there is no explanation of how, or how often, she conducted her research within the two years of her study.  Other significant and absent research facts are the protocols for the interviews, who the participants were, how the participants were selected, and what demographics the participants represented.  Moreover, she never indicates the objective(s) of the classroom observations or how the observations were translated into findings.

The third element is the vagueness of the school and location.  Employing a pseudonym for the charter high school as well as excluding the name of the reservation serves no real purpose.  However, it does perpetuate the ideology that all indigenous communities are culturally and linguistically identical and, therefore, these facets do not contribute to the unique challenges encountered on each reservation.

The fourth element is the employment of a student-participant to analyze the findings. It is not an objective practice to have the student who transcribes the interviews also interpret the data to provide a Native perspective, particularly if the student may know the interviewees.  This knowledge may alter his/her answers due to inherent biases against the interviewees. Furthermore, there is no indication if the student was a participant in the research or served in other capacities as well, such as also being an interviewee.

The last element is that while Fenimore-Smith (2009) claims that she built her research on other similar studies, there is no explicit mention of any other studies’ findings or of how they were conducted.  This statement begs the question of which studies she was referencing and how they correlated with her research methodologies and findings.  Explicitly comparing and contrasting research methodologies and findings would have been another way of ensuring the reliability of the findings, especially if they contribute to the research framework of indigenous educational challenges encountered on reservations.

It is very important, however, to acknowledge that throughout the research process, Fenimore-Smith (2009) addresses her intersectionality and positionality as an outsider to both the indigenous community and the school in which she conducted her research.  She also recognizes that her relationship with the students and staff may have been influenced by their perception of her as a teacher and colleague, therefore altering the data rendered from the ethnographic methodologies.  Fenimore-Smith (2009) notes that she is “fully aware that my interpretation of events may indeed affect ‘the interests and lives of the people represented,’ and it is with this knowledge that I present my findings as understandings, not explanations” (p. 5). The acknowledgement of her positionality and intersectionality reveals layers of otherwise undetectable, inherent biases present throughout the findings.

While teaching in a public school within the heart of the Navajo Nation, I also encountered some similar challenges.  The issue that resonated most with my experiences was that of parental and community involvement.  I had approximately 90-95 students in my 6th grade writing class, but for parent-teacher conferences or “report card parties,” only about 15-20 parents would attend the events. The main issues that prohibited the majority of my students’ families from attending were fiscally embedded.  For example, many of the parents would not attend school events because they did not have enough money to pay for gas, they only had enough gas to go to the grocery store, or they did not have any gas in their vehicle tanks.

Furthermore, many parents were disengaged from their students’ academics due to a myriad of other obstacles.  However, the commonality demonstrated by all the parents was that of multigenerational trauma stemming from the impact of colonization, particularly the boarding school era.  Many parents and community members did not feel comfortable meeting in a school setting due to systemic cultural abuse perpetrated by the education system.  Therefore, following the theory of multigenerational trauma, the trauma experienced by the grandparents and parents of my students at the hands of educators in Western schools was instilled in the younger generations.  Thus, not only does this explain some of the lack of parental involvement, but it may also contribute to the resulting lack of American Indian academic achievement.

Research that explores the complexities of parental and community involvement would be beneficial for American Indian students.  I am particularly interested in learning of any reservation schools that have implemented Epstein’s (2002) triangular partnership model that outlines six types of involvement (Olivos, Jimenez-Castellanos, & Ochoa, 2011). As the model should be tailored to better address the needs of the specific communities in which it is implemented, it would be fascinating to see how it has been adapted throughout various reservations.  As parental and community involvement increases student academic achievement, it is imperative to study different approaches to reaching traditionally marginalized and colonized populations in particular.

Olivos, E.M., Jimenez-Castellanos, O., & Ochoa, A. M., (2011). Bicultural Parent Engagement: Advocacy and Empowerment. New York: Teachers College Press.

Subject Selection

Guzey, S. S., & Roehrig, G. H. (2009). Teaching science with technology: Case studies of science teachers’ development of technology, pedagogy, and content knowledge. Contemporary Issues in Technology and Teacher Education, 9(1), 25–45.

This week I looked at an article called “Teaching Science with Technology: Case Studies of Science Teachers’ Development of Technology, Pedagogy, and Content Knowledge” (Guzey & Roehrig, 2009). The study looks at how a professional development program called Technology Enhanced Communities, or TEC, enhanced science teachers’ TPACK.

TPACK is a theoretical framework derived from Shulman’s idea of Pedagogical Content Knowledge. TPACK is made up of three forms of knowledge: content, pedagogy, and technology. The argument is that a teacher must integrate all three knowledge areas in order to be effective. TEC is described in the article as “a yearlong, intensive program, which included a 2-week-long summer introductory course about inquiry teaching and technology tools.” In addition, there were group meetings throughout the year, which were associated with an online teacher action research course. During the two-week summer course, the participating teachers learned about inquiry-based activities while learning several instructional technologies.

Guzey and Roehrig conducted qualitative research and collected data through observation, interviews, and surveys. In this study, they chose four teachers new to the field, all of whom had less than three years of experience.

The organization of the article makes it very easy to follow and read. However, the order of the sections didn’t make sense to me. Guzey and Roehrig put the profiles of the teachers in between the results and the discussion. This caused it to feel disjointed, as it didn’t flow properly. The authors clearly explain the theories and give examples of the research they came from. However, the research is supposed to be looking at the impact of TEC, yet I felt there was too much focus on inquiry, which is only one component of TEC. Additionally, the authors went to great lengths to explain what TPACK is, but it wasn’t necessary to understand the theory at the depth provided in order to comprehend the research.

Guzey and Roehrig chose beginning teachers because they felt this would provide more commonalities: they had graduated from the same program, they were all going to be teaching their specialty, etc. However, I totally disagree with this approach to selection. Had veteran teachers been selected, there would have been more focus on the authors’ guiding question rather than on common rookie issues (e.g., classroom management, flexibility, lesson planning). Much of the article discusses these issues, which, while they play a role in being an effective teacher, don’t necessarily impact whether or not the TEC program is working. By selecting veteran teachers, much of this would have been avoided.

The analysis gives a pretty clear picture of their work and, if the resources were available, could be reproduced. In the results section, Guzey and Roehrig stated, “Teachers were each found to integrate technology into their teaching to various degrees.” However, their guiding question was how TEC enhances TPACK; how can the depth of technology integration be their result? In the discussion section of the article they state that TEC was found to have a “varying impact on teacher development of TPACK.” That should have been in their results. Unfortunately, since new teachers are learning so much more at one time than veteran teachers, I don’t know how reliable these results are. It is doubtful that this research had a big impact within the field, as the findings were not significant.

The impact that this research had on my area of inquiry is a different story. I have been solely focused on how integrating technology will impact student achievement, and it never occurred to me to consider the teachers’ experience or effectiveness. If a teacher has poor classroom management, adding technology to the mix is not going to increase student achievement. In fact, it is likely to do the opposite. Managing technology in a classroom adds a degree of chaos. Most veteran teachers are adept at establishing new procedures and have enough forethought to know what those procedures should be. One has to be able to understand what problems may arise with students in order to establish procedures that would circumvent said problems. It is unlikely that most beginning teachers have this depth of knowledge. Additionally, veteran teachers have the ability to adjust at a moment’s notice when technology fails, which it does and will. This, again, goes back to experience. It would be like giving a two-handed piano piece to a beginning piano student who is only ready to play with one hand. Reading two lines of music at the same time, maintaining a steady tempo, and including dynamics and phrasing are more than one can expect from a beginning musician; but after a few weeks or months of one-handed pieces, that student will be ready to add a level of difficulty. This is not to say that beginning teachers shouldn’t be using technology; the opposite is true. But utilizing beginning teachers as research participants in a study of how effective technology is may not be the wisest decision.

Professional development is not a point I had considered as a piece of my research. Often, professional development is a hit-and-run experience: we receive an hour or two of training, and then we, the teachers, are expected to have it completely integrated the following day, and we never speak of it again. This could be why so many teachers are so cynical about new programs. As a music teacher, very few of the professional development sessions I have attended have been catered to me specifically. Because of this, I have spent much time over the last twelve years essentially providing my own professional development. On one hand I have become quite proficient at innovating within my classroom, but had I received more guidance from a veteran teacher, it would have taken me less time to achieve what I have. Technology is a tricky area, in that some people are very comfortable with daily technology interactions and some people struggle with turning on electronics. It may be necessary to include a professional development component within my action research in order to create support for the teachers I work with. It would need to be implemented in such a way that the teachers are able to reflect on and discuss their experiences and brainstorm new ideas. This will create lessons that utilize technology to deepen understanding of the concept, not just add technology for the sake of technology. Overall, I enjoyed reading this article; it really got me reflecting on the presentation of my own work and the components I should or shouldn’t include.

Preparing Teachers to be Resilient

Over the past several years I have facilitated countless professional development sessions for our iTeachAZ Site Coordinators, mentor teachers, and teacher candidates. One question that I always ask when I am beginning a session on teaching is, “When you walk out of a lesson that you deem to be effective, what elements have led you to that decision?” Nearly every time I ask that question, participant responses include things like “lessons should be appropriately challenging” or “students should be a little uncomfortable.” These responses, although I am in agreement with them, have always puzzled me. How do you measure the appropriate amount of discomfort or challenge without losing the students’ motivation to stay involved in the lesson? How do we equip students with the tools necessary to persevere in spite of their desire to give up when solving difficult tasks?

In “Managing uncertainty during collaborative problem solving in elementary school teams: The role of peer influence in robotics engineering activity,” Jordan and McDaniel (in press) conducted a qualitative study of fifth graders. The study focused on collaborative groups and the role that the groups played in how students responded to content and uncertainty while working on engineering projects. They explain, “managing uncertainty refers to behaviors an individual engages in to enable action in the face of uncertainty. Uncertainty is a regularly occurring experience for humans. Although it is often a difficult experience to manage, it is not inherently an aversive state. Individuals are often motivated to reduce uncertainty through various information-seeking strategies” (p. 5). Jordan and McDaniel describe uncertainty (or what I described above as appropriate challenge/discomfort) as a feeling, and our natural response is to try to minimize it. Furthermore, they imply that there are strategies that can equip students to persevere and not let feelings of uncertainty result in mismanagement.

In the study, Jordan and McDaniel emphasize the importance of relationships and the key role they play in supporting students as they work through their uncertainty. They describe various responses that students had while working on the engineering project. They observed interactions among the collaborative groups and examined the influence that the collaborating peers had on one another. During one observation, the authors observed a student who wasn’t able to articulate her uncertainty. They noticed that one of the group members began to question, challenge, and explain information to this student to assist her in articulating the uncertainty. The authors noted, “for this peer response to occur, a responder had to believe the uncertainty being expressed by his or her peer was at a minimum legitimate, warranted, or reasonable” (p. 20). This observation implies that students need the ability to empathize, or see things from a different perspective, in order to respond appropriately and support their peers. In this instance, for example, what would have happened if the peer didn’t have empathy? What effect would that have had on the student’s ability to move forward and persevere with the project?

Empathy, which is an emotional intelligence competency, allowed the peers to respond by willingly supporting the student who was struggling. Jordan and McDaniel echo this idea, stating, “students’ success at managing uncertainty during collaborative problem solving was dependent on the willingness and ability of their peer collaborators to respond supportively. As students received responses from peers, those responses acted as negative or positive feedback for subsequent attempts to manage uncertainty” (p. 26). The authors go on to describe groups that did not have supportive peers and the effects that this had on the group members. They labeled these groups as “not particularly well-functioning” (p. 28).

Returning to the question about equipping students with the necessary tools to persevere in spite of uncertainty, it’s clear from the study that cooperative learning played a critical role in students’ perseverance in completing the engineering projects. One could argue, however, that the group members who lacked the emotional intelligence to empathize with and support their peers had an adverse effect on the students’ ability to move forward with the project.

Daniel Goleman (1995) first introduced the idea that one’s social skill, or emotional intelligence (EI), is a great contributor to relational success. Several competencies fall under the umbrella of EI, including self-awareness, emotional management, empathy, and social competence. Further, Low and Nelson (2006) explain EI as a “learned ability to understand, use, and express human emotions in healthy and productive ways” (p. 2). Both Goleman and Low agree that these skills need to be taught and developed. In conclusion, it would seem that peer influence can be an effective tool when students are equipped with the emotional intelligence competencies to support their peers.


Goleman, D. (1995). Emotional intelligence: Why it can matter more than IQ for character, health, and lifelong achievement. New York, NY: Bantam Books.

Jordan, M. E., & McDaniel (in press). Managing uncertainty during collaborative problem solving in elementary school teams: The role of peer influence in robotics engineering activity. Journal of the Learning Sciences. doi:10.1080/10508406.2014.896254

Low, G. R., & Nelson, D. B. (2006). Emotional intelligence and college success: A research-based assessment and intervention model. Center for Education Development & Evaluation (CEDER), Texas A&M University-Kingsville, 1–10. Retrieved from http:// College_Success-2006.cederpaper.pdf




Access, Equity, and Community Colleges


Gilbert, C., & Heller, D. E. (2013). Access, Equity, and Community Colleges: The Truman Commission and Federal Higher Education Policy from 1947 to 2011. Journal Of Higher Education, 84(3), 417-443.


The role of the community college has recently been brought to the forefront of higher education by President Barack Obama as the United States strives to be a global leader in educational attainment. However, it was the Truman Commission that first brought concerns about access and equity in higher education to Capitol Hill in 1947. In “Access, Equity, and Community Colleges: The Truman Commission and Federal Higher Education Policy from 1947 to 2011,” Claire K. Gilbert and Donald E. Heller offer a lens through which we can view and understand the trajectory of U.S. thinking about higher education policy from the end of World War II to the present day (Gilbert & Heller, 2013).

I personally connected to this piece because I have made my career in the community college sector for the last six years. I found some direct correlations to the article’s general material and findings, based on a recent experience at a professional conference for higher education, in which one of the presenters focused on similar material as the discussion turned to access to higher education and the role the community college will play. Many of the topics I read in this article were familiar because of active research and development in my professional role. However, some of the historical information and findings from the authors were new to me, so I found that very appealing. The article also gave me a new idea with regard to research: the way these authors were able to springboard directly off prior research to focus on what is happening today seemed simple yet essential to their piece. As I evaluate my own potential research methods, this article will be a valuable model of how to use others’ research materials to bring credibility to my own. Another thing that grabbed my attention is how forward-thinking and innovative ideas can pave the way for impact and change. The ripple effect of the Truman Commission is still being felt today. This article will influence me to strive for change with my own action research project to support access and equity in higher education.


Gilbert and Heller’s research was well developed and organized in its presentation to readers. The article does a great job of first introducing the audience to what the authors hoped to accomplish with their research. Next, the article provides a solid background on the basis for their research, in this case the Truman Commission of 1947. The researchers laid out the initial intentions of the President’s Commission on Higher Education and their own intent to review the progress that has taken place in the United States since the commission’s recommendations were presented. The report then concludes with findings that compare the commission’s recommendations against what has been accomplished to date. The article read as very clear and concise while presenting reliable information to engage readers.

Contribution to the Field

This article is important to my existing role as a leader working in an institution of higher education, and it is entirely appropriate to my current area of inquiry as an academic researcher. It contributes to the field of study because of the data and empirical evidence it provides. The authors’ findings present detail on a monumental topic in higher education and on how this movement affected access and equity. I also saw strength in the authors’ outcomes in that they did not hesitate to recognize the shortcomings that still plague the education system in the United States beyond the Truman Commission findings. I found this article extremely valuable because it highlights the integration of the community college system and its purpose of helping with access to education, which I hope to investigate more.

Literature Review

Many points of this article stood out to me. The pieces of information that were most powerful, however, were seeing how progressive the idea of this commission was for the U.S. in the 1940s, and the material showing how far we still have to go in improving access and equity in the higher education system in 2014. Before this article I had some understanding of the Truman Commission, but not to the extent I do now. The article did an excellent job of educating me as a reader on the enormous impact this commission had on education policy and the development of the community college system, while also guiding me to see the inadequacies of governmental processes in education policy today. The authors modeled the idea that although the commission paved the way for great change, many years later our country still faces challenges with many of the topics presented in this study.

Theoretical Framework

In reflecting on this article, I feel the authors clearly presented the reasoning behind their research and report. The article provided useful insight that helps frame Gilbert and Heller's intention to examine what has come about, in the way of results, in the U.S. since 1947, when the President's Commission on Higher Education was introduced. The framework carried through the text appropriately presented analysis supporting the authors' message that regardless of whether the report has been explicitly adopted into legislation and policy, its ideals and many of its specific recommendations have been incorporated over time (Gilbert & Heller, 2013).

Data Collection & Analysis

The data collection for this article was very clear. Gilbert and Heller used the 1947 President's Commission on Higher Education report as the basis of their research, discussing the original report at great length to lay out the point of their study. They also used a variety of scholarly research findings and state and national statistics on higher education to support their conclusions. The authors' presentation of statistics and data was essential to my understanding, as a reader, of the progression of the research results. Without some of the metrics given in the writing, I would have found it hard to see the results in some of the findings presented. The methods of data collection and the presentation of the material seemed very traditional and easy for a reader like myself to follow and potentially replicate in the future.

Findings, Discussion, and Conclusion

The article Access, Equity, and Community Colleges: The Truman Commission and Federal Higher Education Policy from 1947 to 2011 brought some very significant findings to light. Gilbert and Heller were able to make logical connections to legislative and general changes in higher education since 1947. Their research presented appropriate findings that helped me see some of the progress that has been made in higher education since the Truman Commission. I was convinced as a reader that sufficient evidence and findings were presented for me to find this reading valuable and important in my research arena. The material made good connections to relevant work, supported by qualitative and quantitative evidence, in support of their research on change in access and equity at community colleges in the U.S. after 1947.

“Community” for online learning

Sadera, W. A., Robertson, J., Song, L., & Midon, N. M. (2009). The role of community in online learning success. MERLOT Journal of Online Learning and Teaching, 5(2). Retrieved from


What are the effects of community in online education contexts, specifically on how students perceive their own success?  This is the question tackled by Sadera, Robertson, Song, and Midon in a 2009 issue of the Journal of Online Learning and Teaching.  The authors contribute to an online learning literature that has already established community as an important element of online learning by studying how, and to what extent, community affects perceived student success.


The paper's readability is inhibited both by an initial lack of clarity in the research focus and by significant typographical errors.[1]  Not until the "literature survey" do readers begin to understand that the focus is on students' perceptions or feelings of their own success (vs. success as determined by observable factors, such as achievement [GPA or course scores], improvements in academic achievement, or retention); the focus becomes clearer still in the rephrasing of the study's purpose at the start of the Methods section.  Otherwise, the study proceeds in a logical, coherent manner typical of a report on social science research, i.e. the introduction is followed by an overview of relevant literature, methods, results, a discussion of findings, and a conclusion, inclusive of the study's limitations and thoughts for future investigation.


Sadera et al.'s work is framed by a sociocultural perspective, which guides their consideration of existing research on community and success in distance education.  They organize the literature into three relevant areas of concentration.  The first explores how communities among geographically dispersed people are defined.  Commonalities among research studies in this vein indicate that communities involve a "shared purpose and the relationship among them including their sense of belonging, trust, and interaction" (p. 278).  The authors then construct a definition of community that seems lacking, given the review of literature just presented.  It reads that a community is "a group of participants, relationships, interactions and their social presence within a given learning environment."  They add that their definition excludes how communities organize and maintain themselves, i.e., community is not defined as or by "the collection of technologies used to manage and communicate within the environment" (p. 278).  The weakness of this definition is striking, because what stands out in their presentation of existing literature on communities is attention to a "sense of shared purpose" or "shared emotional connection," "membership" or "common expectations and goals."  Even a simple, generally applicable dictionary definition explicitly indicates the particular relevance of something shared or common, e.g.: "a social group of any size whose members reside in a specific locality, share government, and often have a common cultural and historical heritage."  Perhaps this is not influential to the research process, but how we define things is so important to our perspective that the stated definition's seeming deficiency seemed worth mentioning.


The second category of literature the authors include confirms the positive relationship between community and perceived student learning.  They cite two particular directions here: (1) a study on the importance or impact of community in different courses, which found no significant difference (though the scope of the study was limited: only two courses, in the same field, were studied); and (2) sense of community and students' perceived learning.  For this second orientation, the authors take up the Classroom Community Scale, an instrument designed specifically "to measure the sense of community in an online learning environment."  (This tool is considered reliable, as its Cronbach's alpha coefficient well exceeds the threshold conventionally accepted in social science research.)  Its application in other studies has shown a "positive relationship between students' sense of community and their perceived learning success in online courses."  The last area of existing research reviewed deals with community and interaction, "especially important in distance education…because it helps reduce feelings of isolation and contributes to the student success in online environments" (p. 279).  Three types of interaction are relevant to this context: between the learner and the content, between the learner and the instructor, and between or among learners.
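To understand the reliability claim here, it helped me to see how Cronbach's alpha is actually computed. The sketch below uses invented Likert ratings, not anything from the study or the Classroom Community Scale; it simply illustrates the formula behind the reliability coefficient the authors reference:

```python
# Cronbach's alpha: internal-consistency reliability of a multi-item scale.
# alpha = (k / (k - 1)) * (1 - sum(item variances) / variance of total scores)
# The ratings below are invented purely for illustration.

def cronbach_alpha(items):
    """items: list of per-item score lists, one score per respondent."""
    k = len(items)                      # number of items on the scale
    n = len(items[0])                   # number of respondents
    totals = [sum(item[i] for item in items) for i in range(n)]

    def variance(xs):                   # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var = sum(variance(item) for item in items)
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Four hypothetical Likert items answered by six hypothetical respondents:
ratings = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [5, 5, 2, 4, 1, 5],
    [4, 5, 3, 4, 2, 4],
]
alpha = cronbach_alpha(ratings)
print(round(alpha, 2))
```

By convention, values above roughly 0.70 are treated as acceptable in social science research, which is the bar the Classroom Community Scale is said to exceed.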


Data collection was organized around three areas of inquiry:

– Is perceived learning affected by participation in the online community?

– How does the sense of community affect perceived learning?

– Does the amount and type of online interaction affect the feeling of membership in the learning community?

An online Likert-scale survey was offered to undergraduate students enrolled in online courses at an accredited US university.  The authors had an 11.3% response rate, which left them with a sample of 121 participants, representative of "adult learners pursuing a technical undergraduate degree online" (p. 280).  Underlying the survey questions were three objectives: (1) to collect demographic data, including previous experience in online courses; (2) to assess specific efforts to build community in the course, course design elements (including the instructor's role), and the role of online technologies; and (3) to gauge students' active participation in the course and community, including frequency of use of online technologies.


The instrument underwent a pilot several months before formal data collection, which contributes to the reliability of their approach.  SPSS was used to analyze the data, with Pearson's correlation applied to address the three research questions in turn.  The researchers found significant positive correlations between students' self-reported time spent on task and perceived learning, and between their self-reported participation in learning activities and perceived learning.  In other words, the authors found a relationship between students' active involvement in the online education community (however formed or described) and learning.  Next, they report a positive correlation between students' perceived learning and community (evaluated via connectedness scores).  Finally, their analysis of the online technologies used to interact found that only email had any significant impact on connectedness or learning.  In sum, the study finds that learner interaction and engagement, sense of community, and success in online learning are strongly correlated.
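Since Pearson's correlation does the statistical heavy lifting in this study, I wrote out the calculation to remind myself what a reported coefficient actually measures. The paired scores below are invented for illustration; they are not the study's data:

```python
import math

# Pearson's r: covariance of two paired variables, scaled to the range [-1, 1].
# r = sum((x - mx)(y - my)) / sqrt(sum((x - mx)^2) * sum((y - my)^2))

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical weekly hours on task vs. a perceived-learning score:
hours = [2, 4, 5, 7, 9]
perceived = [11, 13, 16, 17, 23]
r = pearson_r(hours, perceived)
print(round(r, 2))  # values near +1 indicate a strong positive relationship
```

A positive r means the two measures rise together; it does not by itself establish that one causes the other.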


The authors note worthwhile research foci for the future, based upon the limitations of their study's scope and its findings.  Primarily, they indicate the importance of future research that asks the same general questions as this study: how community relates to success among online learners.  Research involving different populations (besides the adult undergraduate students comprising this study's sample) would contribute to the literature.  Studying factors beyond what is specifically associated with the courses in the research scope, including activities a school or the broader environment might undertake to help cultivate a sense of community, or elements of course design built with community cultivation in mind, would support a better understanding of community and learning in an online environment.  Also, more research is needed on how online learners may experience community in different [types of] courses.  In the literature survey presented by the authors, a study by Rovai and Barnum is mentioned, which looked at students' experiences in two online courses.  But since the courses were in the same general field (education), and the overall scope was small, the findings are not generalizable.


Of particular interest for me, pursuing the development of a junior/high school online education program, is the finding that email, not other online tools such as chat and discussion boards, influenced students' sense of community.  Given that students in grades 7-12 in the pilot implementation of my program (a blended learning format, not fully online) find the use of email either incredibly arduous or highly undesirable, I am surprised.  This may point to a difference in online communication preferences between today and 2009, when the study was conducted.  It is also likely that the adults in the study, irrespective of the era (acknowledging the rapid pace of technological change and use), use technologies and communicate differently than 12-19 year olds.  Exploring, or hoping for, future research on how K-12 students prefer to connect and how this influences their achievement is relevant to my work.


Also, I am especially interested in the study's finding that learners with the experience of at least one online course did not experience community or connectedness in the same way as online learning novices.  The study found that these students seemed to find community in conferences more than in active participation in elements of their course(s) that might lend a sense of community.  This reminds me of the important finding of Liou, Antrop-González, & Cooper (2009) that students benefit academically from communities of practice that may be well outside of their academic environment.  Their community cultural wealth model highlights the importance of communities such as those created by students' families or localities for student success.  Further investigation of how learners (particularly in grades 7-12) succeed academically, in part through their role in, and the characteristics of, community within their online education context will be important to my work, and to online education in general.


Liou, D. D., Antrop-González, R., & Cooper, R. (2009). Unveiling the promise of community cultural wealth to sustaining Latina/o students’ college-going information networks. Educational Studies, 45(6), 534–555. doi:10.1080/00131940903311347


[1] For example, on page 278, the authors refer to the same research conducted by Rovia and Rovai.  Or, on page 279, a sentence that would make the point of the paragraph is left unfinished: “Not only does online interaction impact on students’ sense of community, but it is also found to be related to students’ learning success in.”

Learning Outcomes and Engagement

Strayhorn, T. (2008). How College Students’ Engagement Affects Personal and Social Learning Outcomes. Journal of College and Character, X(2), 1–16.


This article presents possible interventions to influence student engagement, which in turn results in student learning. A widely accepted model for identifying change, called I-E-O and developed by Astin in 1991, is presented. In the model, I represents "inputs", E represents "environment", and O represents "outcome". Astin's model is considered foundational in evaluating the impact of planned interventions (or activities) with students. Using data collected through the College Student Experiences Questionnaire (CSEQ), the researcher conducted quantitative analysis to identify potential activities (inputs) that would yield a measurable increase in student learning (outcome). The possible outcomes originated from the Council for the Advancement of Standards in Higher Education (CAS). The research was therefore attempting to determine appropriate inputs that would correlate with the desired CAS outcomes.

Literature Review

The literature review focused mainly on the frameworks for analysis: the I-E-O model and the CAS standards. In accordance with the I-E-O model, student learning is the result of inputs and environment. The specific desired learning outcomes were identified from the CAS standards; see Table 1 in the article, in which the researcher categorized the standards based upon desired outcomes.

According to the researcher, the CAS standards are a commonly agreed-upon set of outcomes we hope for students, including categories related to developing effective communication practices, accepting diversity in thought and experience, forming meaningful relationships, and acquiring the ability to think critically.  The researcher also defined student engagement as "'the time and energy that students devote to educationally purposeful activities and the extent to which the institution gets students to participate in activities that lead to student success' (Kezar & Kinzie, 2006, p. 150)" (Strayhorn, 2008, p. 6).

Quantitative Research

The research study seeks to answer two research questions: "(a) Is there a statistically significant relationship between students' engagement in college experiences and personal/social learning gains and (b) What is the relationship between students' engagement in college experiences and their self-reported personal/social learning gains, controlling for background differences" (Strayhorn, 2008, p. 2). The researcher is adding to the body of work based upon a possible gap in research in this field.

The CSEQ is administered by Indiana University Bloomington and is typically used for assessment. It comprises 191 items "designed to measure the quality and quantity of students' involvement in college activities and their use of college facilities" (Strayhorn, 2008, p. 4).  It was administered to 8,000 undergraduates attending 4-year institutions.  The researcher took the survey data and identified certain questions thought to correlate with specific learning outcomes from CAS. Component factor analysis was used for the initial round of quantitative analysis, followed by hierarchical linear regression, in which variables are entered into the model in an order determined by the researcher.
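To better understand what "hierarchical" means here, I sketched the entry-order idea in miniature: fit a first block of predictors, then add a second block and see how much additional variance (R²) it explains. Everything below (the data and the variable names "background" and "engagement") is invented for illustration and uses plain ordinary least squares; it is not Strayhorn's data or analysis:

```python
# Hierarchical linear regression, in miniature: enter predictors in
# researcher-chosen blocks and compare how much variance (R^2) each block adds.
# The tiny data set below is invented purely for illustration.

def ols_r2(y, predictor_cols):
    """R^2 of an OLS fit of y on an intercept plus the given predictor columns,
    solved via the normal equations with Gaussian elimination."""
    n = len(y)
    X = [[1.0] + [col[i] for col in predictor_cols] for i in range(n)]
    k = len(X[0])
    # Build X'X and X'y.
    xtx = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(k)]
           for i in range(k)]
    xty = [sum(X[r][i] * y[r] for r in range(n)) for i in range(k)]
    # Gaussian elimination with partial pivoting.
    for i in range(k):
        p = max(range(i, k), key=lambda r: abs(xtx[r][i]))
        xtx[i], xtx[p] = xtx[p], xtx[i]
        xty[i], xty[p] = xty[p], xty[i]
        for r in range(i + 1, k):
            f = xtx[r][i] / xtx[i][i]
            for c in range(i, k):
                xtx[r][c] -= f * xtx[i][c]
            xty[r] -= f * xty[i]
    beta = [0.0] * k
    for i in reversed(range(k)):
        beta[i] = (xty[i] - sum(xtx[i][j] * beta[j]
                                for j in range(i + 1, k))) / xtx[i][i]
    yhat = [sum(b * x for b, x in zip(beta, row)) for row in X]
    ybar = sum(y) / n
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

background = [1, 2, 2, 3, 3, 4, 4, 5]    # block 1: a hypothetical control
engagement = [2, 1, 3, 2, 4, 3, 5, 5]    # block 2: the predictor of interest
gains      = [3, 3, 5, 5, 7, 7, 9, 10]   # outcome: self-reported gains

r2_step1 = ols_r2(gains, [background])
r2_step2 = ols_r2(gains, [background, engagement])
print(f"step 1 R^2 = {r2_step1:.2f}, step 2 R^2 = {r2_step2:.2f}, "
      f"delta = {r2_step2 - r2_step1:.2f}")
```

Because the second model nests the first, its R² can only stay the same or rise; the size of that increase, after the background block is controlled for, is what a hierarchical analysis interprets.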

Limitations of this research include a lack of detail about how participants in the survey were selected. Also, only 4-year institutions were included; community college students might have been represented if they had transferred, but that information was not provided. The initial review of the data can be replicated, since the data are available. However, the researcher relied on assumptions, first to correlate what he perceived to be relevant data points with the CAS standards, and second to organize the analysis based upon possible impact.

 Implications and Future Research

Based upon the analysis, the researcher concluded that peers and active learning had the greatest impact on student engagement. Institutions should therefore consider programs that bring students together and support learning, such as peer study groups, peer mentors, and social outreach. Since faculty provide the opportunities for active learning, the discussion extended to possible research opportunities that faculty could offer students. Strayhorn (2008) specifically suggests "programs should be (re-) designed for faculty and students to collaborate on research projects, co-teaching experiences, and service learning activities…" (p. 11). Future research might be beneficial in showing how peer and faculty engagement opportunities correlate with successful student outcomes. Strayhorn (2008) further clarifies this by stating "future research might attempt to measure the impact of engagement on other learning outcomes such as those identified by CAS including effective communication, appreciating diversity, and leadership development…" (p. 12).

Another possible extension of this research is to incorporate the I-E-O model alongside student development theories, which are theories advisors can use to understand how a student is maturing and growing (Williams, 2007). I mention this to suggest that a student's phase of development could be an influential factor in how the student responds to inputs and environments. This possible extension relates to my research field, since I am beginning to explore outcomes related to advising interventions. It could include qualitative research alongside the quantitative analysis; an example would be conducting interviews to get a sense of whether the inputs suggested by this research lead to different levels of outcomes depending on the phase of the student's development.


Williams, S. (2007). From Theory to Practice: The Application of Theories of Development to Academic Advising Philosophy and Practice. Retrieved from NACADA Clearinghouse of Academic Advising Resources Web site:


Actively Learning From Each Other

Benware, C. A., & Deci, E. L. (1984). Quality of learning with an active versus passive motivational set. American Educational Research Journal, 21(4), 755–765.

In the article, Quality of learning with an active versus passive motivational set, the authors, Carl A. Benware and Edward L. Deci (1984), questioned whether students with an active orientation would be more intrinsically motivated to learn than students with a passive orientation. In other words, if a student has a reason to learn that is tied to an internal reward, that student will have an advantage over another student who is learning 'just because.' The study had a control and an experimental group. The instructions for each group were essentially the same, except that the experimental group was told that upon returning in one week they would be required to teach the material to another student, who would then be given an exam on that instruction; this would be the reward.

After one week, both groups returned to the laboratory. The control group was given a survey and examined on the material. The experimental group was given the survey and then told that they would not be teaching other students after all, but were instead being given an exam to enable the investigators to understand their learning process and how well they knew the material. The findings indicate that the students in the experimental group were much more interested in learning the material, enjoyed participating in the experiment more, and were more willing to participate further (Benware & Deci, 1984). Additionally, their memorization of the material and their conceptual learning were significantly better.

Comments: Strengths and Contributions

Organization: The article is designed so that the reader can deduce that the purpose of the study (the quality of learning) is differentiated by the motivational factors behind the learning (extrinsic versus intrinsic). What this research highlighted is that there are many different ways to transfer knowledge to students without creating elaborate and time-consuming methods.
Contribution to Field: While this is an older study, I feel the findings are critical in developing my action research, particularly the motivating factors behind why students learn in one way versus another.
Literature Review: One study the authors refer to is that of Bargh and Schul (1980). The main thrust of their study, On the cognitive benefits of teaching (Bargh & Schul, 1980), is that when people learn in order to teach the material to another student or class, they use different cognitive structures than those who learn just to be examined on the material. Again, while this is an older article, I feel that the way humans learn is relatively as old as time.
Data Collection: The data collection method was a relatively simple process of having a control group and an experimental group. Both groups were given an article to read with essentially the same instructions, except the experimental group was told that when they returned to the lab (after one week of studying the material) they would teach the content to another student, who would then be tested on the material.
Findings: Upon analyzing the assessments, the findings indicate that interest, enjoyment, and participation were significantly greater in the experimental group. More striking was that the conceptual learning score of the experimental group was markedly better than that of the control group.

Since the article (Benware & Deci, 1984) was published nearly 30 years ago, I was a little hesitant about its relevance. However, after reading it I discovered it held significant value for me, for a couple of reasons:

  • This study highlighted that "active learning" does not require a significant amount of money or time to produce favorable results, unlike the approach described in the article The flipped classroom: A course redesign to foster learning and engagement in a health professions school (McLaughlin et al., 2014), where the authors describe a major stumbling block: the initial time investment for the instructor is significant and, though it later diminishes, the time commitment for the lab or teaching assistant remains high.
  • The combination of the results from the survey and the examination demonstrated that if students were told they were going to use the material learned (to teach another student), it led to greater conceptual knowledge (Benware & Deci, 1984).

This has sparked my interest in possibly reproducing a study of this nature at the College of Medicine – Phoenix. Given that these students are pursuing a career in medicine, some of them simply because they want to help people, this may be a good test of whether their altruistic goals are measurable. Also, one of the core goals within Academic Affairs at the College is "to understanding basic and clinical science, (and) students must learn to engage with people in compassionate and understanding ways" (University of Arizona, College of Medicine – Phoenix, Academic Affairs, 2014, para. 2). I may want to design a study that measures how to achieve this goal through the use of student-taught active learning.


Bargh, J.A., & Schul, Y. (1980). On the cognitive benefits of teaching. Journal of Educational Psychology, 72, 593-604.

Benware, C. A., & Deci, E. L. (1984). Quality of learning with an active versus passive motivational set. American Educational Research Journal, 21(4), 755–765.

McLaughlin, J. E., Roth, M. T., Glatt, D. M., Gharkholonarehe, N., Davidson, C. A., Griffin, L. M., … Mumper, R. J. (2014). The flipped classroom: A course redesign to foster learning and engagement in a health professions school. Academic Medicine: Journal of the Association of American Medical Colleges, 89(2), 236–243.

University of Arizona, College of Medicine – Phoenix, Academic Affairs. (2014). Retrieved from

Increasing Access to Study Abroad Via a Blended-Learning Experience Model

Slotkin, M. H., Durie, C. J., & Eisenberg, J. R. (2012). The benefits of short-term study abroad as a blended learning experience. Journal of International Education in Business, 5(2), 163–173. 

With the 2013 Open Doors report finding that 76.4% of U.S. study abroad participants are White, there is much discussion in the education abroad field about how to increase access to study abroad for underrepresented groups (Institute of International Education, 2013).  Particularly given that the demographics of higher education participation are changing to include increased numbers of Black, African American, Hispanic, Latino, and non-traditional students, institutions like ASU are trying to determine how best to help their study abroad participation rates reflect their institutional enrollment makeup.  The article written by Slotkin, Durie, and Eisenberg details an emerging model that may hold promise for helping other institutions offer study abroad opportunities to these and other traditionally underrepresented groups.

Slotkin et al. describe a short-term study abroad program begun in 2011 through the Florida Institute of Technology's College of Business (FTCoB).  The FTCoB program paired a short trip to Madrid, Spain with an online learning component, thus constructing the blended-learning experience for this short-term study abroad program.  The reasons for implementing such a model stem primarily from the unique makeup of the FTCoB student population, whereby:

  • 60% of the FTCoB students are enrolled in online programs
  • 39.4% of students enrolled in the offsite campuses identify themselves as minority students
  • 34.8% of online undergraduate students identify themselves as minority students
  • Online students are typically mid-career, full-time professionals pursuing a degree part-time
  • Offsite campus students are predominantly active military personnel, military veterans, or employees of U.S. government contractors and typically adult learner, part-time students

Because of the high proportion of minority and less-traditional students comprising the FTCoB's student population, creating the blended-learning study abroad experience was essential to ensuring the program's viability and to serving the disparate needs of these groups.

This article did not carry out a research study per se; instead, it used a brief literature review to establish the benefits likely to be derived by distance-learning students participating in a study abroad experience, and then offered perspectives and discussion based on the experiences of the FTCoB program.  The review of the literature explored studies such as Donnelly-Smith (2009), which drew comparisons between distance learning and short-term study abroad: "mirroring previous debates held on the efficacy and rigor of online education, academics and administrators question if students can receive the same benefits in a short-term as opposed to mid or long-term programs" (Slotkin et al., 2012, p. 165).  It appears that the earlier debates about online learning bear similarities to those currently being launched at short-term study abroad programs.  Slotkin et al. continue by presenting literature (Donnelly-Smith, 2009; Mills, Deviney, & Ball, 2010) suggesting that short-term programs can actually be ideal in that they afford students a more structured learning environment, and that having an abroad experience to translate into future career skills is particularly important for business students.  In terms of access, Slotkin et al. also cite literature suggesting that short-term programs afford students who have other obligations, such as family or work, and who represent minority or low-income populations, the opportunity to participate in an abroad experience that "provide[s] them with the cultural and academic skills they will need to compete in a global workforce" (Mills, Deviney, & Ball, 2010).

In terms of its strengths and areas for improvement, the article is very well organized, providing the reader first with essential background on the unique demographics and structure of the Florida Institute of Technology and then presenting the authors' hypothesis based on relevant literature.  The article ends with a succinct outline of the makeup of the program and a recapitulation of the main benefits the authors believe the blended-learning study abroad program afforded students.  Its contribution to the field is strong because, as previously stated, increasing the number of minority and less-traditional students in study abroad is a current topic of interest for the field at large, as evidenced by international education organizations such as the Diversity Network and the numerous conference sessions at NAFSA: Association of International Educators and the Forum on Education Abroad dedicated to the topic.  However, as the article freely admits, research on the intersection of education abroad and online learning is in its infancy, so there is much room for growth.  In the case of this particular article, the FTCoB program is relatively new and its numbers therefore too small to support generalizations for the field; however, the authors' arguments are grounded in logical assertions based on the literature and hold promise for future research in this area.

The perceived benefits that Slotkin et al. outline include:

  • Enhanced viability: the ability to pull from the online and offsite campus student populations meant a better chance for the study abroad program to meet its necessary enrollment numbers to be able to offer the program in the first place.
  • Enhanced diversity:  “Of the remote students who constituted the FTCoB study abroad, more than 70 percent identified themselves as African-American or Latino, in stark contrast to the predominantly white and international mix hailing from the main campus” (p. 168).
  • Enhanced experience for distance-learning students: whereas the online students typically only had interaction with university faculty via phone and e-mail, the study abroad program gave them a chance to interact with FTCoB professors, foreign professors, and their peers with many remarking “that they missed the process of contemporaneous discussion with their peers and professors” (p.169).
  • Enhanced relationship with the main campus: by affording online students the ability to have face-to-face interaction with campus constituents, Slotkin and Eisenberg propose that the FTCoB study abroad program may lead to increases in alumni participation and giving rates by this student population.  This is grounded in the claim they cite from Black et al. (2006), “campus visitation may increase the distance education student’s sense of inclusion into the university community at large.”

When I think about Slotkin, Durie, and Eisenberg’s article in relation to my experiences working with short-term programs at ASU, I am excited by the possibility of further developing the blended-learning model to increase access for these underrepresented populations.  ASU currently runs a few short-term programs that include an online pre-trip module, so I plan to run some statistics to see whether these programs have a higher participation rate among students from the ASU Online community or other marginalized populations than our non-blended-learning programs do. With 9,612 students enrolled in the ASU Online program (Keeler, 2013), this seems like a sizable population from which to draw study abroad participants, and I have noticed an increasing number of online students applying to programs I coordinate.  I think the possibility for more targeted marketing, advising, and resources for this population is great.
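To make that comparison concrete, one straightforward approach would be a two-proportion z-test on the share of online students in blended versus non-blended programs. The sketch below uses entirely hypothetical counts (I have not yet pulled the actual ASU figures), so it only illustrates the mechanics of the test, not any real finding.

```python
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Compare two participation rates with a pooled two-proportion z-test.

    Returns (rate_a, rate_b, z statistic, two-sided p-value).
    """
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF: Phi(x) = (1 + erf(x/sqrt(2))) / 2
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical counts: online students among blended-program participants
# (18 of 120) vs. among non-blended-program participants (25 of 480).
pa, pb, z, p = two_proportion_z(18, 120, 25, 480)
print(f"blended rate={pa:.1%}, non-blended rate={pb:.1%}, z={z:.2f}, p={p:.4f}")
```

With counts like these, a small p-value would suggest the blended programs really do draw proportionally more online students; with the real data, the same function applies unchanged.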

In terms of further study that might effectively build on this area of research, I think it would be important to examine the needs of distance-learning students and non-traditional students in preparing for a study abroad program.  As I have slowly begun to see increased participation in study abroad programs from these two groups, I have noted that each requires a different preparation process.  One example is as basic as the course registration process for online students.  Currently, there is no automated process for allowing these students to register for study abroad courses as there is for our regular campus students, and a lot of troubleshooting goes on.  Similarly, non-traditional students have specific advising needs in light of their family and professional situations that I feel would warrant specialized advising and pre-departure orientation sessions.  It would be very beneficial to learn what other specific needs these populations might have when pursuing a study abroad program.

Donnelly-Smith, L. (2009), “Global learning through short-term study abroad”, Peer Review, Vol. 11 No. 4, pp. 12-15.

Institute of International Education. (2013). “Profile of U.S. Study Abroad Students, 2001/02-2011/12.” Open Doors Report on International Educational Exchange. Retrieved from

Keeler, S. (2013). Record number of students choose ASU. Retrieved June 13, 2014, Retrieved from

Mills, L., Deviney, D. and Ball, B. (2010), “Short-term study abroad programs: a diversity of options”, The Journal of Human Resource and Adult Learning, Vol. 6 No. 2, pp. 1-13.

Slotkin, M. H., Durie, C. J., & Eisenberg, J. R. (2012). The benefits of short-term study abroad as a blended learning experience. Journal of International Education in Business, 5(2), 163–173.

Reinvigorating our System of Science Education


Thorp, L. & Townsend C. (2001). Agricultural education in an elementary school: an ethnographic study of a school garden. 28th Annual National Education Research Conference. 347-360.

When will educators take a moment to realize that science education has shifted from a foundation of wonder to a system of teacher accountability, test scores, and rigorous scientific curriculum? What would happen to science education if we began placing as much emphasis on wonder as we do on accountability, scores, and science standards?  Laurie Thorp and Christine Townsend (2001) take a naturalistic approach to improving science curriculum by studying the “impact of an agricultural education garden-based curriculum on the students and teachers of a Midwestern elementary school” (p. 348). The purpose of their study was to gain a “phenomenological understanding of the impact of an agricultural education-based curriculum on the students and teachers” (Thorp & Townsend, 2001, p. 348) and address the problem of “declining standardized achievement scores” within this community (p. 348). Through a case study, the researchers wanted to accentuate the positive effects of a garden-based curriculum but constantly felt “pressure to demonstrate improvement of academic performance in the design of their research and curriculum” (Thorp & Townsend, 2001, p. 349). Although the gardening movement comes with a large number of benefits for teachers and students, a majority of the studies analyzed by the researchers were “unable to report any significant difference in academic achievement as a result of the gardening program utilized” (Thorp & Townsend, 2001, p. 350). Even with the limitations previously reported by other researchers, Thorp and Townsend continued with their case study and discovered a wide variety of benefits.

Thorp and Townsend (2001) explore our “relationship to the land and what it might offer agricultural educators struggling to engage children in the learning process” (p. 347) by introducing the topic, past research, and the purpose of the study; then discussing the theoretical framework and methodologies; and concluding the article with a case study comprised of rich participant descriptions, a conclusion, and further recommendations. The framework for the research project is built from the past research all the way through to recommendations for action, which provides a rich discussion of the implementation of a school garden. The agricultural integration in a struggling school is explored through a consistent lens of “human development coupled with environmental awareness or connection with nature” (Thorp & Townsend, 2001, p. 349). Throughout the article, the researchers develop a theory that human relationships with nature are a combination of endogenous and exogenous forces, a theory supported by the qualitative data collected throughout the case study.

The methodology utilized by the researchers is “axiomatic to naturalistic inquiry” (Thorp & Townsend, 2001, p. 350) and is process oriented, meaning “the research design becomes nimble, adaptable and exquisitely finessed to the local context of the study” (Thorp & Townsend, 2001, p. 350). Thorp and Townsend (2001) use a variety of qualitative methods: interviews and dialogues, participant observations, documents, photographic images, naturalistic data analysis, and content analysis, allowing the researchers to assemble a full description of the participants’ experiences. Although a large variety of qualitative data is collected, quantitative student data, such as test scores, would have helped give a fuller view of the effects of the program implementation. To establish the credibility of the methodologies, the authors referred to the following criteria: catalytic validity, triangulation, reflexivity, and understanding, which they argue judge “the quality or validity of phenomenological inquiry by standards appropriate to the paradigm” (Thorp & Townsend, 2001, p. 352). A similar approach was used for analysis, which allowed all parties to be involved in the data analysis process and progress to occur throughout the case study. Naturalistic data analysis was used throughout the case study, allowing for self-correction and validation.

An outstanding description of the case study is presented in a first-person narrative that offers a vivid account of the participants’ experiences throughout the study. Because the case study was introduced this way, I was able to relate to the underperforming school and truly see the benefits that the participants encountered. I felt the presentation of the study was incredibly powerful, moving, and motivating to an educator who works in a similar environment. The benefits were thoughtfully analyzed and included improvements in school culture and pride, improvement in creativity, cross-curricular projects, community connections, and an enthusiasm that test scores could not create. All of these improvements made me think: Are we taking the fun and excitement out of education by focusing on test scores and teacher accountability? Should underperforming schools continue to focus on test scores or begin to focus on the renovation of their school culture?

This analysis of school gardening brought about a wide array of questions about how underperforming schools are approached and how we attempt to improve science curriculum by increasing its rigor. Maybe it is time for our schools to address some of these concerns by “[discovering] how agricultural educators might reconnect students to school via a garden” (Thorp & Townsend, 2001, p. 348), or by considering how other educators can integrate real-world experiences into their classrooms to encourage student engagement and participation.

This article has increased my knowledge of the integration of school gardens and has motivated me to continue researching the many advantages that such a project could have for my students at the Academy of Math and Science. The authors’ recommendations for further research in this area are to utilize emergent design, not to rush the process, and to reflect during all aspects of the research. As far as practice is concerned, the researchers also provided guidance to those who wish to take action: include a volunteer to assist in implementation, involve parents and families in the process, include an Extension Service Master Gardener, and do not allow curriculum to hold you back from implementation.

Overall, the researchers present a wide variety of information and evidence that there are many benefits to implementing a gardening program in an underperforming school, but there are limitations. An engaging science activity such as this improves school culture and student engagement but does not show a correlation with improvement in test scores, which, unfortunately, might limit the number of schools interested in engaging in this type of program. I believe that if we can pique a student’s interest, we might begin to see improvements in other areas, such as test scores, which is why I am interested in investigating the long-term effects of a garden program on overall student test scores. This article has sparked my interest, and I am going to continue to explore the idea of integrating a school garden into the curriculum at the Academy of Math and Science.






SCOR Model: Efficiently Mapping Educational Processes

Supply Chain Council, Inc. (2010). Supply Chain Operations Reference (SCOR) Model Overview          Version 10.0. Retrieved from

When thinking of a supply chain, most people outside a manufacturing industry don’t quite know what the term means. A supply chain, in its most basic form, is the chain of events that leads to the production of a product, inclusive of delivery to a customer. The customer could be anyone from a grocery store to an individual, depending on the product. Within the supply chain process, many firms and organizations approach the production of their product differently. As our society has evolved, supply chains have begun to encompass other “products,” including digital products as well as services (items without tangible products). Supply chain has become a function in firms large and small, encompassing different processes, procedures, and tactics.

With that in mind, many organizations began to develop guidelines to serve as benchmarks for supply chain management. One of those is the Supply Chain Council (SCC). The SCC was founded in 1996 as a consortium of organizations focused on peer-led research and analysis of the supply chain industry and the best practices that could be developed for organizations.  Since its founding, SCC has continued to evolve and grow, including new organizations in its scope and research. One major outcome of the establishment of the SCC was the Supply Chain Operations Reference (SCOR) model, which is the “world’s most widely accepted framework for evaluating and comparing supply chain activities and their performance” (SCC, 2010, p. 2).  Now in version 10, the SCOR model continues to serve as a benchmark for many firms; professors within W. P. Carey who teach supply chain have referenced it many times in conversation with our students.  The benchmarking allows firms to easily understand what may or may not be working, or what direction the firm would need to take to make a specific process or action work with company strategies, particularly as it compares to the performance of other firms (SCC, 2010, p. 3).

From a visual perspective, the model is presented as a diagram (not reproduced here) spanning the chain from the supplier’s supplier through to the customer’s customer.
At the most basic level, the model considers the different stakeholders (the supplier’s supplier, the supplier, the organization, customers both internal and external, and then the customers of those customers). This is then broken into the basic functions that go into those relationships and aligned with planning (SCC, 2010, p. 4). By following the model, firms should be able to address five key challenges:

  • Superior Customer Service: right product for the right price at the right time
  • Cost Control
  • Planning and Risk Management
  • Supplier/Partner Relationship Management
  • Talent

By solving those challenges and implementing the model, firms should be able to better launch services or products, have better linked processes to strategies, clearer direction for organizational growth and other benefits (SCC, 2010, p. 4). The model is further designed to better provide metrics and data that can be used to recognize trends and other organizationally important information (SCC, 2010, p. 6).

With the application to education in mind, although we are not producing a product per se, higher education institutions provide an array of services to support individuals through their educational experience. Many of the supply chain challenges listed above apply directly to higher education as we seek to provide value-added, outstanding service; cost control in our processes; better planning and risk management; relationship management (with vendors, federal groups, organizations, and individuals); and the attraction and development of talent (from top learners to top administrators and faculty). As institutions, we need to think about how we can improve our own processes and programs in the face of increasing competition, whether from other institutions, professional organizations, or even internally between programs. Competitiveness will not go away, and the more technology evolves, the more opportunity individuals will have to connect to knowledge from institutions in different parts of the world.
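To make that translation concrete, SCOR’s five top-level management processes (Plan, Source, Make, Deliver, Return) can be sketched against higher-education analogues. The education-side mappings below are my own illustrative assumptions for discussion, not anything prescribed by the SCC document.

```python
# Toy sketch: SCOR's five top-level processes mapped to hypothetical
# higher-education analogues. The right-hand descriptions are illustrative
# assumptions, not SCC guidance.
scor_to_education = {
    "Plan":    "enrollment forecasting and course scheduling",
    "Source":  "recruiting students, faculty, and vendor services",
    "Make":    "delivering instruction and student services",
    "Deliver": "graduation, placement, and credential issuance",
    "Return":  "alumni feedback, re-enrollment, and program revision",
}

def describe_chain(mapping):
    """Render the process chain as a readable list of steps."""
    return [f"{process}: {analogue}" for process, analogue in mapping.items()]

for step in describe_chain(scor_to_education):
    print(step)
```

Even a rough mapping like this makes it easier to ask, stage by stage, which institutional processes have clear owners and metrics and which do not.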

In direct application to graduate business programs within W. P. Carey, I see a correlation between some of the founding principles of this model and process improvement. Business schools are continuously evaluated on their performance, their outcomes and their rankings. It is a continuous battle to remain as a top tier school so approaches that could help add value to what the school can do become of the utmost importance. The biggest challenge here will be aligning this model to processes and strategies as well as getting buy-in from all key stakeholders who must help make change happen.

In other posts, I have reviewed different models and approaches through which supply chain principles could have value for higher education institutions. The SCOR model sets a good baseline for where supply chain models can go and the ways they may add impactful improvements to a process chain; indeed, many elements of the SCOR model crop up in other existing models.

Overall, the document comes across as very technical, as you would expect from a process model guide, but the design and integration aspects are clearly displayed and explained to help firms understand. The document, being a founding model, does not reference other work but does build upon its own growth through its versions, continuously improving upon the process. What is most helpful about the document is that it approaches the model from a general perspective. Although no industry-specific application examples are provided, I feel this allows firms to think for themselves about how the model could work within their organization and its strategic direction.

The SCOR model presents a good catalyst for higher education organizations for consideration in change. By looking outside education to other industry examples, higher education may find innovations that were not considered before that allow them to create sustainable, innovative, creative and engaging processes, experiences and organizations. Doing so should offer the opportunity for continued growth and success.

Blog 3: Caring Classrooms

Research Blog Three

Battistich, V., Solomon, D., Watson, M., & Schaps, E. (2010). Caring school communities: Creating a caring community in the classroom. Published online by Routledge, 08 Jun 2010.

Strengths, Contributions, and Ways to Improve: Graphic Organizer

Organization: The article was well organized and well written. The argument was developed and the analysis was informative.

Contribution to Field: The article’s contribution to the field was meaningful and significant.

Literature Review: The article did not provide a literature review.

Theoretical Framework/Lens: The article clearly demonstrated coherent theoretical framework. The research focused on creating a community of caring in the classroom.

Data Collection: Data was collected from The Child Development Project, as well as various sources.

Analysis: The article had a philosophical impact on current education action research.

Findings: The findings of the video were inconclusive; however, the research does outline some assumptions about culturally relevant pedagogy and its meaning for intercultural learning.

Discussion/Conclusions: The article provides a formula for creating a successful classroom environment.

Minor Editorial Comments: No editorial comments for the article.

Miscellaneous: No miscellaneous comments for the article at this time.


Caring School Communities: Creating a Caring Community in the Classroom

The article, Creating a Caring Community in the Classroom, discusses school change and an intervention aimed at enhancing students’ social and ethical development. This article was selected to advance my understanding of the culture of a great classroom environment. Furthermore, the subject is directly aligned with the week four theme of “leadership and innovation within action research.” The crux of this article is what can be understood by researching prosocial development, providing students with opportunities to reach academic and social goals, and providing meaningful pedagogy.

The research for the study was conducted with teachers at three elementary schools that implemented the program over a 7-year period. The research team evaluated the program’s effectiveness by following a longitudinal cohort of students in those schools and in three other similar schools.

The major findings from the article were these: first, schools differ greatly in the extent to which they can be characterized as caring communities. Second, school community is significantly related to a large number of desirable outcomes for both students and teachers. Third, a particular set of classroom activities and practices is related to the sense of community, as are certain student behaviors.

How does the article relate to my own experiences? This article was very specific to my community of practice. As a former middle school teacher, I was constantly seeking new ways to create a sense of “community” in the classroom. The data and research in the article were robust, offering an analytical approach to building a successful classroom environment. This is a prime example of impact. As I reflect on my own experiences as a middle school classroom teacher, the research reminded me of the ongoing efforts to create and maintain a professional learning community.

What is the impact of the article on education research?

In my opinion, this article had a profound impact on my education action research. As a researcher, I find myself looking for new information and new methodologies to add to my bucket of knowledge. I chose this article because it provided me with a better understanding of classroom culture, which is directly aligned with my area of inquiry. The article offered fresh perspectives and good insight that will benefit my area of inquiry in various ways. I hope to use this research to support my argument and to offer a fresh perspective on an age-old area of concern: the classroom.

The Misappropriation of College Retention Programs

Love, B. J. (1993). Issues and problems in the retention of black students in predominantly white institutions of higher education. Equity & Excellence in Education, 26(1), 27-36.

Barbara J. Love (1993) takes a strong look at retention issues in her article, Issues and Problems in the Retention of Black Students in Predominantly White Institutions of Higher Education. Published over twenty years ago, this article presents solid information about Black student retention in White universities and factors that cause Black students to drop out prior to graduation. As a means for future study, this article provides a historical perspective on the issue of Black student retention which can be compared to recent literature on the topic.

The goal of Love’s (1993) article is to identify issues in retention programs that are not traditionally addressed in Predominantly White Institutions (PWIs). For years, graduation rates for minorities students, specifically Black students, have been dismal in PWIs. Historically, Black students graduate approximately one third less frequently than their White counterparts (Love, 1993). As a means to remedy the stagnation of Black graduation rates, higher education institutions created significant retention programs to address attrition issues without significant results. However, Love (1993) identifies a research gap between what Black students identify as factors causing them to drop, and what PWI institutions identify as retention issues. Accessible literature showed that most retention programs focus on changing the student and their behaviors, while failing to examine issues of institutionalized racism (Love, 1993).

Using James Meredith, the first Black student admitted to the University of Mississippi, as an example of the growing number of Black students who enroll in White institutions, Love (1993) reveals that more students of color are now enrolled in college than ever before, yet graduation rates remain low. U.S. Census Bureau data showed that 34% of Black high school graduates attended college in 1976, dwindling to just 27% in 1983 (Evans, 1985). Additionally, more Black students enrolled in junior or community colleges than in four-year institutions (Love, 1993). In 1985, Blacks comprised 12% of the U.S. population, yet represented only 8% of undergraduate students. PWIs admit nearly 80% of Black college students; however, only 60% of those students received Bachelor’s degrees from those institutions (McCauley, 1988). The drop-out rate for Black students is eight times higher than for White students enrolled in the same institution. Love (1993) presents this data to show the discrepancy between Black students enrolled in PWIs and those who actually complete their degree, pointing toward a “revolving door that cuts short the promise of educational equity” (p. 28).

Love (1993) draws on Marvalene S. Hughes’ (1987) article, Black Students’ Participation in Higher Education, in which Black students enrolled in Historically Black Colleges and Universities (HBCUs) described factors that contributed to their success. Students reported feeling welcome and comfortable in the learning environment at HBCUs. Students attributed their comfort partly to the ability to “hang out” with other Black students in their major and residence halls. Additionally, students felt at ease talking with professors and staff, making connections to student and academic services accessible. By contrast, experiences for Black students at PWIs are quite different. Black students typically find themselves ignored in classrooms, blocked from campus social life, and harassed by campus police (Noel, Levitz, & Saluri, 1985). Love (1993) goes on to say that “Black students in PWIs must be strong self-starters who are fully independent, with strong defenses to combat stereotypes, fears, alienation and loneliness” (p. 28). Although retention programs have been implemented to improve Black graduation rates at PWIs, none address institutionalized racism as a factor in attrition. Love (1993) lists seven categories of factors contributing to Black student retention:

  • White racism: overt and covert systems of racial prejudice, bias, and hatred toward Black and other students of color, resulting in the loss of opportunities or advancement
  • Institutional leadership: strength of administration to recognize and combat racism within the institution
  • Finances: awareness, and availability of financial support through government funding, or personal or familial finances
  • Social interaction, cultural dissonance, and environmental incongruence: intra and interpersonal relationships with other students in the institution; the divide between the student’s personal culture and the university culture; the capability of the university to respond to the student’s needs, goals, and aspirations
  • Faculty-student interaction: how students feel toward White professors, and comfort level in asking for additional instruction, advice, or information
  • Student services: the awareness and friendliness of dining halls, residence halls, gyms, counseling services, and student work positions
  • Student characteristics: student’s familial and academic background, self-image, self-esteem and “locus of control” (belief that either internal or external factors decide one’s fate)

Love (1993) uses a study by Noel, Levitz, and Saluri (1985) entitled Increasing Student Retention, in which the authors evaluated several college retention programs, examining the factors mentioned above. They found that no program addressed issues of racism or leadership within the institution, and that the majority of programs focused on student characteristics as the main factor in Black student attrition. Love (1993) concludes and recommends that retention programs in PWIs must address the full range of retention problems affecting Black students rather than concentrating on the factors institutions feel most comfortable addressing. White institutions should develop programs to eliminate racism by examining policies, practices, and individual attitudes of students and faculty, which may have an effect on the student’s course load, academic major choice, satisfaction with the university, and overall performance (Love, 1993). Finally, training for institutional leadership should be required for the efficacy of all retention programs. The administrations of PWIs are traditionally comprised of White men who themselves attended White institutions during an era when Black enrollment was not a topic of interest. Such training enables institutional leaders to understand and recognize racism in order to provide access and educational equity.

As stated previously, this article is quite dated; yet it provides significant historical data that will allow me to compare factors in Black student retention in decades past to current factors. Love (1993) uses a clear, concise writing style that makes each section of the article understandable and purposeful; there is no uncertainty in the content. She introduced the article by discussing the disparity in Black student retention, which immediately caught my attention, before moving into significant factors and accessible literature on the topic, showing cohesiveness within the content. Additionally, Love (1993) shows no trepidation about the issue of institutionalized racism, a topic typically avoided and deemphasized in higher education research. This article is essential for my research because it focuses specifically on retention programs and the lack of recognition of racial factors in Black student graduation rates. The most intriguing point of this article for me is that Love (1993) takes issue with placing responsibility on the student to manage their own educational experience, and with retention programs that focus on the student’s personal characteristics and their ability to integrate themselves into the university culture, an immense and unbearable task for marginalized students. I plan to explore the area of student responsibility in retention and integration at PWIs within my own research. An additional area of research could be to compare successful Black students at PWIs to those who are unsuccessful, using that information to improve current retention programs. Also, the connection between “locus of control” and retention should be examined to see whether Black students typically feel that external factors such as campus climate, student services, faculty, and social and academic organizations are ultimately responsible for their experience at White institutions.
Overall, Love’s (1993) article complements my research initiatives by providing uncomfortable yet important information on how retention programs have failed Black students, giving me a foundation to explore current issues and trends in minority student retention.


Berry, B. (1983). Blacks in predominantly white institutions of higher education. In J. D. Williams (Ed.), The state of black america (pp. 295-318). New York, NY: National Urban League.

Evans, G. (1985, August 7). Social, financial barrier blamed for curbing Blacks’ access to college. Chronicle of Higher Education, 1-15.

Hughes, M. (1987). Black students’ participation in higher education. Journal of College Student Personnel, 532-55.

Love, B. J. (1993). Issues and problems in the retention of black students in predominantly white institutions of higher education. Equity & Excellence in Education, 26(1), 27-36.

McCauley, D. (1988). Effects of specific factors on blacks’ persistence at a predominantly white university. Journal of College Student Development, 45-51.

Noel, L., Levitz, R., & Saluri, D. (1985). Increasing student retention. San Francisco, CA: Jossey-Bass.

PBL + SL = A Successful Developmental Learning Community

Butler, A., & Christofili, M. (2014). Project-based learning communities in developmental education: A case study of lessons learned. Community College Journal of Research and Practice, 38(7), 638-650. doi: 10.1080/10668926.2012.710125

For this week’s readings, we were assigned Michelle E. Jordan’s and Reuben R. McDaniels, Jr.’s article focusing on managing uncertainty during a collaborative activity. The paper documented students’ attitudes and perceptions toward this style of teaching. This article reminded me of my own experiences with project-based learning (PBL), as well as much of the literature I have read over the years about PBL. Consequently, I decided to focus my research review this week on the efficacy of project-based learning in a community college developmental classroom. Below is a summary of a recent article in the Community College Journal of Research and Practice.
Article Summary

The purpose of Butler and Christofili’s study was to further examine the relationship between project-based learning and service learning within the context of a developmental education learning community. The goal of the study was to “help instructors avoid some of the pitfalls that arise when forming and implementing PBL and to help instructors implement successful PBL” (Butler & Christofili, 2014) by strategically designing the project.

The study was conducted at a large urban community college in the Pacific Northwest: Portland Community College. The study focused on four learning communities involving developmental courses (math, reading, English, and college success), examined over the duration of four terms/semesters. The first two semesters of the study involved developmental education students, a learning community, and project-based learning. The second two semesters introduced service-learning into the learning community (Butler & Christofili, 2014).

The researchers documented each semester’s learning community in the following manner: project question, project implementation, competency assessment, and lessons learned.  Overall, the researchers provided conclusions regarding the design of PBL, with a service-learning component, integrated into a developmental learning community.  Specifically, projects must be of proper scope, instructors need to be flexible given all the potential moving parts, projects must be relevant to learning in respective courses, and managing student group dynamics must be purposeful and strategic (Butler & Christofili, 2014).

Strengths and Critiques

The strength of the article is its practical application to designing learning communities within a community college environment. The authors provide tangible recommendations for design elements and strategies to integrate service-learning into a learning community. The authors also provide a solid literature review that includes references to many studies and articles illustrating the efficacy of learning communities and service-learning for community college students.

The overall research described in the study is lacking. The researchers reviewed student feedback and their own experiences as both researchers and instructors. Overall, I expected greater emphasis on student voice in the research, but this was not evident. I found no evidence that students were interviewed to determine their attitudes and perceptions. Furthermore, I did not find evidence that all the instructors across the four disciplines were interviewed either.

The authors make many claims regarding the successes or failures of the respective learning communities, but do not clearly describe the evidence on which those claims are based. For example, the authors state that the story theme of the third term project generated “overwhelming student buy-in” (Butler & Christofili, 2014), but I did not find evidence as to how the researchers came to this conclusion. The majority of the authors’ conclusions are based on their observations of the students and review of students’ projects. However, I question the strength and objectivity of this case study analysis, as both authors were also the instructors of the program. I appreciate the perspectives of the instructors; however, I believe the research could have been enhanced with a third-party observer/researcher interviewing students, observing classes, and reviewing final projects.

I was very disappointed that this article did not include persistence data (students enrolling in the next semester and remaining at the college) for the students participating in the learning community. The article would have been strengthened with more quantitative data. The only statistic provided was that the retention rate for the third term was higher than in previous terms (Butler & Christofili, 2014). Statistics, as we have discussed, do not tell the whole story. But in this case, I believe evidence that a learning community designed in this manner could lead to a) increased retention, b) increased persistence, and/or c) a higher percentage of course success is vital if other instructors or community colleges are to adopt this type of instructional model.

Consequently, I offer the following suggestions to improve this study:

  1. Utilize an observer who is not an instructor;
  2. Provide data as to the success, retention, and persistence rates of the respective cohorts;
  3. Provide evidence for the conclusions and assertions that are made; and
  4. Focus more on student learning outcomes and impact on the community organizations involved in the service-learning component of the instruction.

My Take

Despite the reservations I have regarding this case study analysis, I am excited about how this article relates to my current role at GCC. I have been charged with launching our service-learning effort at the college. We have had pockets of service-learning offered by faculty in various disciplines; however, we do not have a coordinated effort which supports faculty in these endeavors. Furthermore, I do not believe we have an understanding across our college that service-learning is and can be a meaningful instructional strategy that promotes learning. Most individuals, when talking about service-learning, tend to focus on the service: the benefits to the community organization and how participation in service-learning improves students’ feelings and perceptions toward school. This article, though, reminded me of the need to emphasize that service-learning can and does improve student learning. And, the article sparked in me an interest to learn more about the impact of service-learning on the developmental student population. I would venture a guess that the majority of service-learning programs in community colleges across the US focus more on high-achieving students (possibly a research question to explore….). But, as this study suggests, the strategy can have a positive impact on developmental students. Ultimately, I am now rethinking how we roll out our service-learning initiative. Perhaps we could target a range of interested faculty across multiple disciplines, with developmental education students being a focus. This may prove to be a strategy that positively impacts our success rates, while also emphasizing the role we play as a college in our community.

Another take-away from the article is that instructors struggled to build accountability into their group projects. I am continually surprised at how frequently this comes up as a challenge for instructors. Designing effective collaborative learning experiences is challenging, and instructors need to plan extensively to build individual and group accountability into the course for all students involved. Repeatedly, the instructors indicated that students were upset that some of their classmates did the majority of the work, while other students apparently did less. This has always been a challenge of collaborative learning, and there are many articles and guides developed to assist faculty in developing strategies to ensure students are accountable for the work of the group, as well as their individual role within that group. This article serves as a reminder that additional professional development is probably needed locally at GCC to provide faculty with the skills and strategies to design meaningful and effective collaborative learning experiences.

Finally, I have a renewed sense of excitement around the benefits of learning communities and service-learning in developmental education.  And, this renewed excitement may inspire me to focus my research efforts in this direction.

“Stars” Transition Program

Berlin, L. J., Dunning, R. D., & Dodge, K. A. (2010). Enhancing the transition to kindergarten: A randomized trial to test the efficacy of the “Stars” summer kindergarten orientation program. Early Childhood Research Quarterly, 26, 247-254.

My area of interest for research and innovation is in the area of the transition period that children experience from home to kindergarten or from preschool to kindergarten.  Since the start of my studies this summer, I have read many articles in this area that have focused on the importance of successful transitions into kindergarten.  I have learned many practical ideas for implementation that would help support what research deems as the best practices in the area.  In my mind, I have started to apply what I have learned to the context of my own school and community.  I started asking myself, given my school and community demographics, strengths, and needs, what would a successful program look like for the students and families we serve?

I came across a study of a kindergarten transition program that mirrors the type of program I can see being funded and implemented in my own school and community. Berlin, Dunning, and Dodge (2010) developed a transition program called “Stars” that was designed primarily to help students with their social transition into kindergarten. The program touched on pre-academic skills such as pre-literacy and pre-numeracy, but mostly the focus was on school routines, the social aspects of kindergarten transition, and parent involvement (Berlin et al., 2010). The program was held for four weeks in the summer prior to kindergarten.

Berlin et al. (2010) found that participation in the “Stars” program eased children’s social transitions as judged by kindergarten teachers. When the children had the same teacher for kindergarten as they did in the “Stars” program, the effect was even stronger (Berlin et al., 2010). Although there was not a significant effect in the area of academics, the researchers reminded readers that the focus was not on the academic piece, but more on the social aspect of kindergarten transition. The study also found that, compared to peers who did not participate in the “Stars” program, children who did participate had an overall better ability to adapt to kindergarten expectations and routines (Berlin et al., 2010). In further analysis of the results, the researchers also noticed that the positive effects of the “Stars” program were more pronounced for girls than for boys. They attributed this effect to the possibility of greater male vulnerability to social stressors (e.g., Zaslow & Haynes, 1986) and to teachers’ differential relationships with preschool-age girls and boys and/or unmeasured processes (Berlin et al., 2010). They also noted that the same gender effect occurred in previous studies, such as the Perry Preschool Project, Abecedarian, and Early Training Project (Anderson, 2008). It is interesting to note, however, that the same findings did not hold in two recent and well-known studies in early childhood transition: the large-scale evaluation of the Early Head Start Program and the NICHD Study of Early Child Care (Berlin et al., 2010).

The methods and findings of this study have helped me to think about my plan for innovation in my local community in the area of kindergarten transition. The study authors noted in their conclusion that they saw benefit in repeating the study on a larger scale (Berlin et al., 2010). The researchers felt that the smaller sample size may have limited their ability to use certain data-gathering materials, as well as limited the exploration of a wider range of moderated program effects. Berlin et al. (2010) also recommended the use of more qualitative measures, such as parent, teacher, and student interviews and questionnaires.

I can see the value in using these suggestions in my own research. I believe that, given district support, I can implement an innovative, research-backed program in many of our 59 elementary schools. Although I am not sure what sample sizes would be deemed acceptable as “larger,” I feel that I may have the opportunity to use a larger sample in the southwest area of my district. Based on this study, I also think it would be interesting to add a deeper qualitative research approach to capture the dynamics of the transition with regard to parent, teacher, and student feelings about their experiences.


Anderson, M. A. (2008). Multiple inference and gender differences in the effects of early intervention: A reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects. Journal of the American Statistical Association, 103, 1481-1495.

Berlin, L. J., Dunning, R. D., & Dodge, K. A. (2010). Enhancing the transition to kindergarten: A randomized trial to test the efficacy of the “Stars” summer kindergarten orientation program. Early Childhood Research Quarterly, 26, 247-254.

Zaslow, M. S., & Haynes, C. D. (1986). Sex differences in children’s responses to psychological stress: Toward a cross-context analysis. In M. Lamb & B. Rogoff (Eds.), Advances in developmental psychology (pp. 289-337). Hillsdale, NJ: Erlbaum.