You take the good, you take the bad

Reference

Pinto Alicea, I. (1995, September 15). RESEARCH: The state of Hispanic education. The Hispanic Outlook in Higher Education, 6, 10. Retrieved from http://search.proquest.com/docview/219213390?accountid=41434

Pinto’s (1995) article, “The State of Hispanic Education,” published in The Hispanic Outlook in Higher Education, focuses on Hispanics and teaching positions in higher education. The numbers are growing, with Hispanic women making the most progress toward becoming full-time faculty at colleges and universities (Pinto, 1995, p. 1). The author points out both the good and the bad regarding Hispanics working at institutions of higher education as members of the teaching faculty. The numbers continue to trend upward in terms of progress being made, but the author immediately compares this progress to drop-out rates.

Pinto references an American Council on Education report and describes some of its data: “The report also found that the Hispanic drop-out rate in 1993 was 27.5 percent, nearly four times the rate for whites. Hispanics comprised 29 percent of all dropouts even though Hispanics account for only about 12 percent of the 16- to 24-year-old population” (Pinto, 1995, p. 5). Factoring in the rising Hispanic population in America, these data are troubling: as the population grows, far too large a portion of it is dropping out of school. Though major strides are being made, especially by Hispanic women in higher education institutions, the overall trend needs to reverse or at least slow dramatically.

Still, Pinto does a nice job of describing the heartening trends for Hispanic women in education. Pinto (1995) writes, “Latinas accounted for much of the increase in doctorates conferred to Hispanics. In 1993 alone, the number of doctoral degrees received by Hispanic women jumped 12 percent compared to 2.9 percent for Hispanic men. Although Hispanic men continued to earn slightly more doctorates between 1983 and 1993, the number of doctoral degrees awarded to Hispanic women increased at a faster rate than that of Hispanic men” (p. 10). The implications of this study need to be examined further. Discovering the reasons behind the positive inroads made by Hispanic women and applying them to all Hispanic students would be a great start. There must be something useful that can be gleaned from the positive data sets. I’d like to think that it is not gender specific and that something universal can be found, upon analysis of the data, which can then be turned around to help the whole of the burgeoning Hispanic population.

Organization

This was not a hard article to follow, though it was not organized with any guiding headings. It read more like a block of text, but paragraphs were used to signal that ideas were shifting or evolving. Still, the author was definitive in her writing, especially at the outset of the piece. Her sentences were direct and non-esoteric and related directly to the data on college enrollment gains among Hispanic students as well as the data on high school drop-out rates among Hispanics.

Contribution to Field

This article is important to my overall research, and it contributes to this field of study because it elucidates data and finds experts in the field to analyze and support that data. The trends the author discusses are important. The growth in the number of Hispanic women graduating from college and becoming college professors is a trend to keep an eye on. Pinto (1995) writes, “Interestingly, Hispanic women made the most gains; the number of bachelor’s degrees they earned from 1981 to 1992 more than doubled, and the number of women earning first-professional degrees nearly tripled” (p. 9). I would like to point out, though, that this article is from 1995, so there is much more recent data to look at. One reason I chose this article is that I can use it as a baseline against more recent college graduation rates from the last few years. Also, Pinto provides solid reasoning for the trends in education, which was valid in 1995 and is still valid today.

Literature Review

Pinto’s use of literature supports her overall article and her data analysis. She cites reports from the American Council on Education and uses quotations from Hector Garza, the director of ACE’s Office of Minorities in Higher Education. I think both of those in concert strengthen the arguments that she makes. I do feel like the numbers she presents speak for themselves, but adding a council director as an interpreter of the results is even better. Pinto (1995) quotes Garza here: “The study captures the status of Latinos in education,” Hector Garza, director of ACE’s Office of Minorities in Higher Education, says of the results of his organization’s 13th Annual Status Report on Minorities in Higher Education. “It gives us a measure from which to judge our success rate. Latinos continue to make progress but still have a long way to go to reach parity and the education goals for our community” (p. 2). I’m happy, again, to have found this report, not because it features the most up-to-date data but because it will serve as a baseline for showing progress or regression when set next to more recent reports. Pinto also cites the U.S. Department of Education, the U.S. Census Bureau’s Current Population Reports, and the National Center for Education Statistics. Additionally, Pinto cites Ricardo Martinez of the Hispanic Association of Colleges and Universities. Pinto (1995) quotes Martinez here: “We are deeply concerned about the pre-collegiate drop-out rates” (p. 2). I also think that comparing college success rate data to high school drop-out rates provided me an interesting contrast and gave me a clear picture of the status of Hispanic education during this time period. A major factor in the drop-out problem seems to be the language barrier. This is something second-language learners will perpetually have to deal with; however, being cognizant of this fact has helped educators develop early intervention strategies to better serve these types of learners.

Data Collection

In terms of data collection, Pinto utilizes the aforementioned reports, but she does not use data exclusively from 1995. Pinto cites report data from as far back as ten years prior, discusses data trends from this time period, and covers it in depth. Also, alongside her data collection she uses authorities in the field to comment on the data. I feel that this only buttresses the arguments that she makes. It would be one thing if she took a data set and commented on it herself, but when she cites ACE data and then supports it with an ACE director, her arguments are much more effective.

Analysis

Pinto’s analysis takes key data points found in these studies and makes them accessible to the layperson. She also does a really nice job of subdividing the trends found in this report. For instance, Pinto (1995) writes this regarding the types of advanced degrees earned by Hispanic students: “The most popular category was education, where Hispanics earned 211 doctorates in 1993, followed closely by the social sciences, with 182 degrees awarded” (p. 10). This is an important trend to note, and it’s one that I’ll follow up on when I look at some of the more recent reports on college degrees earned by Hispanic students. She also went on to write that engineering was the least popular of all advanced degrees (Pinto, 1995, p. 10), and I’d like to see if this trend has continued to this day. I wonder whether anything was enacted, because of the results of these studies, to support Hispanics in pursuing degrees more related to math and science, and if so, whether that measure was successful.

Since this article was written in the nineties, I’d like to see the results of the increase in doctoral degrees among Hispanics. I wonder whether Hispanic educators twenty years ago produced another generation of Hispanic students pursuing degrees in education, or whether their influence produced the scientists and engineers of this generation.

Theoretical Framework/Lens

Pinto’s roles in her article are myriad. She’s a reporter, a researcher, a data collector, and a cheerleader. She’s an advocate and a critic. She’s a well-wisher and a chider. I feel emotion in her data analysis; I feel pride, and I feel like she’s disappointed. Mostly, I feel like Pinto is someone who cares about the Hispanic population and foresees a bright future for Hispanic students.

Findings & Conclusions

Ultimately, as previously mentioned, Pinto’s conclusions center on foreseeing constant growth for young Hispanic learners. Still, she sees a need for more intervention to better serve Hispanic students. Here, Pinto (1995) quotes Garza once again: “We need a national plan,” Garza says. “For the Latino community as a whole, we have made progress in college enrollment and graduation rates, but we continue to have a problem with the drop-out rate and in K-12” (p. 3). Including this quotation amongst the data shows that Pinto has concluded that there is much work to do with regard to this issue. Pinto has found two sets of data with regard to Hispanic students: advanced degrees are on the rise, especially among Hispanic females, but drop-out rates are far too high for Hispanic students attending high school. Using Garza’s quote shows that Pinto acknowledges the success of rising college graduation rates but knows that there are many more goals still to be accomplished.

 

Boys vs Girls vs Computers

Dixon, F., Cassady, J., Cross, T., & Williams, D. (2005). Effects of technology on critical thinking and essay writing among gifted adolescents. The Journal of Secondary Gifted Education, 16(4), 180-189.

 

Article Summary

My area of interest is critical thinking, technology, and pedagogy. The pedagogy I hope to focus on is Bloom’s Taxonomy. I want to create a study where all three meet in my fifth grade classroom. This study interested me because it is about critical thinking and technology, and part of the assessment used to measure critical thinking focuses on analysis, synthesis, and evaluation, which are the higher levels of Bloom’s taxonomy, so this research fits well with my classroom goal. The study tries to determine whether a computer can help students become better critical thinkers when they write, and whether the students’ gender makes any difference in the outcome.

The researchers set out to determine whether technology would have an impact on the writing and critical thinking of gifted high school students. The research was conducted at a residential, gifted high school. Ninety-nine incoming juniors wrote an essay; 39 of those students were male, 60 were female, and all were sixteen years old. One year later, as incoming seniors, the students wrote a second essay. The prompts for both essays were based on the same poem, and the directions given were identical. The second time the students wrote, they were randomly assigned to one of two groups: one group handwrote the essay as they had done the first time, and the second group used a computer to compose their thoughts. These essays were scored by the same two people using the same rubric they had used the year prior. The essays measured critical thinking on a five-point scale that assessed the skills of analysis, synthesis, and evaluation and did not focus on the mechanics of writing. To score the essays, two raters were brought in and trained until interrater reliability was established. A second critical thinking assessment, the Watson-Glaser Critical Thinking Appraisal, was also used; it is an 80-question test that measures critical thinking in a different way than the essays do. The results compared the two scores.

There were two sets of results that this study examined. One was the comparison of the critical thinking scores and the basic writing indicators; for that comparison, the study concluded that the critical thinking scores were significantly related to the writing indicators. The other evaluated whether gender and computer use were interconnected, which was the primary focus of the study. “A…2 (male, female) by 2 (word process, handwrite) repeated measures multivariate analysis of variance was employed, examining four dependent variables at two points in time (WS-1, WS-2). That revealed statistically significant main effects for gender, method of writing at WS-2, and the repeated factor (time)” (Dixon, Cassady, Cross, & Williams, 2005, p. 185). The study found that boys did better using the computer; it found no difference for girls.
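To help myself picture the quoted analysis, here is a minimal sketch, in Python, of a comparable mixed-design analysis for a single critical-thinking score. This is my own illustration, not the authors’ code or data: the file name and column names are hypothetical, and where the authors ran a 2 (gender) by 2 (writing method) repeated-measures MANOVA across four dependent variables, the sketch fits a simpler mixed model with a random intercept per student.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per student per time point, with
# columns for student_id, gender, method (word process vs. handwrite at WS-2),
# time (WS-1 or WS-2), and one critical-thinking score. Illustrative file only.
df = pd.read_csv("essay_scores.csv")

# A mixed linear model with a random intercept per student approximates the
# repeated-measures design for a single dependent variable; the published
# analysis was the multivariate (MANOVA) version across four dependent variables.
model = smf.mixedlm(
    "critical_thinking ~ gender * method * time",
    data=df,
    groups="student_id",
)
result = model.fit()

# The summary lists the gender, method, and time effects and their interactions.
print(result.summary())
```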

 

 

Strengths and Critiques

This study had several limitations that were not addressed. First, the sample size was very small, which makes the results hard to generalize. Not only does the study begin with only 99 students, but it analyzes the results by gender, and the genders are not split evenly to start with, so only 39 boys are part of the study out of the 99 total. That is before the students are divided for the second half of the study. The researchers also said that the students were randomly assigned to write either by hand or on the computer. What they did not specify is whether the assignments were deliberately balanced so that the students were evenly divided, or whether the random assignments were balanced by gender. If they were not, then there is no way of knowing how many male students used the computer and how many completed their second essay by hand, and in that case the results could be very skewed. Even if the students were evenly divided by gender, that would have left only 19 boys in one of the groups and 20 in the other, which is a very small group.

Another issue to examine is that this research specifically states that it was done with gifted students. The authors cite the lack of research in the field of gifted education, and doing research specifically for the gifted community is a valuable contribution to the field. However, in addition to repeating this study within the gifted community, because of the small sample size it might also be a good idea to run another study with a non-gifted population to see what those results show. An additional study with special education students might also prove worthwhile.

 

One more issue that was not addressed was the amount of word-processing skill or familiarity that any of the students had with the computers. We don’t know how often these students had access to the computers they used for the second essay and whether that could account for any of the disparity. If, by chance, the males had access to the computers more often, that might be a contributing factor in their increased scores. We also don’t know whether the students were ever given word-processing classes, how often, or whether the boys had the same access as the girls. Other unknown factors that could have impacted the study were the students’ connection to or interest in either of the prompts or in the poem.

 

The overall organization of the article was good, and the literature review was detailed. Several articles were cited to support the connection between critical thinking and writing, as well as articles supporting the use of computers to assist in writing. There were no editorial errors.

 

 

Connecting to Past Research

As a classroom teacher who teaches writing, I think that this study has some interesting promise. Writing fluency is an important component of being a competent writer. More research needs to be done to explore the effects of using the tools that are available. Not only should larger sample sizes be used, but other types of research could be explored; for example, what types of hardware (tablets, personal computers, etc.) help versus hinder the writing process? Will word-processing programs get in the way of students’ writing because they become too encumbered with the minutiae of spell-checking and editing rather than focusing on the bigger picture of concepts? If technology is available and can help students with the writing process, then it is definitely something that should be explored. If it is prone to help one gender more than another, that is worth examining as well. If the finding from this research that males are able to write significantly more fluently by using computers proves valid, the ramifications could have an enormous impact on the way we help students learn. The technology is readily available in many classrooms, and if we come to expect that boys have access to technology for their writing and, as a result, they become better able to communicate their thoughts, it could have a great impact on their ability to reach academic excellence.

 

Furthering My Area of Interest

This connects to my research in that it opens my mind to the type of technology I might use with my students. This study made me realize that I don’t have to use something extravagant in order to be effective and have an impact on student learning. I had been searching for a specific technology “tool” to use, but that is something I am now starting to rethink. In one of the research articles I read, the authors named a specific Microsoft program that they used (Hubbard & Price, 2013). After reading this research, I am reconsidering the direction to take. My goal is to use technology to help students become better critical thinkers based on sound pedagogy. The background piece for this research began by explaining that laptops are in students’ hands every day, which is why the authors chose to do this study. The impact of their study alone, even with its issues, has made me consider letting some of the students in my classroom use computers to see if it will help them become more fluent writers. It has also made me realize that when I choose my study, it would be more helpful to my students, and to anyone who reads my study, to use technology that is part of their daily lives rather than an isolated tool.

 

 

Hubbard, J. D., & Price, G. (2013). Cross-culture and technology integration. Journal of the Research Center for Educational Technology (RECT), 9(1).

External Factors on Organizations

Scott, W. R., & Davis, G. F. (2007). Organization of the environment. In Organizations and organizing: Rational, natural, and open system perspectives. Upper Saddle River, NJ: Pearson Prentice Hall.

Summary

In Organizations and Organizing: Rational, Natural, and Open System Perspectives, Scott and Davis (2007) describe how an organization is a product of its environment. They draw an analogy to Darwin’s theory of evolution, stating that only the strongest or most adaptable organizations survive. Chapter 5, “Organization of the Environment,” describes how organizations are developed, how new organizational populations are created, how organizations are shaped by political and social influences, how organizations take on different forms to meet the demands of their populations, and how organizations strategically respond to all of these demands.

In creating organizations, there is usually a meeting of minds. Oftentimes these minds come from similar cultures, ideologies, and ethnic backgrounds (Scott & Davis, 2007). Along with these individuals, it is highly likely that an external organization will take part in the birthing and nurturing of a new organization, whether through advice for success or through financial backing (Scott & Davis, 2007).

Most developing organizations target an existing population, much as similar businesses do. They have products and marketing strategies similar to their competitors’ and market to the same audiences; a large grocery or department store is a good example. In targeting populations, new organizations often rely on “imprinting,” which refers to a given set of norms and conditions that a population is already used to, as a way of weeding out certain sections of a given population (Scott & Davis, 2007).

Once an organization is established, it fights for the limited resources in its subject-specific arena. From populations to products, organizations are constantly shifting and molding themselves to different external forces. According to Scott and Davis (2007), organizations have three types of frameworks for dealing with the social, political, and cultural strains placed on them: regulative, normative, and cultural-cognitive. The regulative framework is the most visible and most easily changed element of an organization; it is often able to respond quickly to the demands of external factors. The normative framework covers the day-to-day operations of the organization; it is slightly slower to respond to demands, and changes to it often take some getting used to by both the public and internal employees. Finally, there is the cultural-cognitive framework. This framework usually lies at the heart of the institution and is usually the hardest and slowest element to alter, as it is traditionally made up of the values and beliefs of the given organization. Often, the only ways of changing this framework are a complete reorganization of the institution, the disassembly of the organization, or a merger in which interests shift (Scott & Davis, 2007).

Finally, organizations often have to give in to specific demands of their environment. For example, organizations will often respect the cultural celebrations and customs of their given environment; by accommodating these customs, an organization shows that it is integrated into its community. Scott and Davis (2007) also describe how organizations are bound by political factors and regulations outside of their control. For example, there are specific rules and regulations that all businesses have to abide by for public safety and social cohesion; if organizations do not follow these policies, they risk being fined, sued, or shut down. Finally, Scott and Davis (2007) depict organizations as servants to available resources and economic factors. Organizations will oftentimes have to change strategies quickly to keep populations interested, purchasing, and happy. For example, during the economic downturn, many organizations had to lower their prices, lay off employees, or create new products in order to stay afloat.

Strengths and Critiques

The book chapter is clearly organized. It opens with a brief introduction about organizations and their environments, then discusses organizations from their conception through their struggles, and concludes with their demise or triumph. This makes it easy for the reader to follow the theory and almost provides a visual of the life of an organization.

The contribution to the field is not very strong, as the chapter provides little information on how to build an organization; however, it does provide insight into the initial success of organizations and a glimpse of why certain organizations fail or succeed. By providing a brief look at the stages of an organization, one is able to use this structure to reflect upon a specific institution and begin to understand the shortfalls and successes of that organization in light of its external environment.

Scott and Davis (2007) do not have strong data collection methods. Most of the information is built upon other researchers and theorists within the field. The strength of their theory relies upon controlled examples and explanations of how basic commonalities are often seen across organizational histories.

Scott and Davis’ (2007) findings seem convincing, as they use logical examples to explain and describe their theoretical ideas. It seems as though their findings could be used as a reflective tool for both researchers and administrators in an organization.

Personal Use

Compared to other readings about organizational theory and development, Scott and Davis (2007) focus on the influence of external factors on an organizational environment rather than looking at the specific structure of the organization. This chapter offered a new research perspective that I can take when developing a growth plan for the Academic Affairs Department at the College of Medicine – Phoenix. When looking at external factors, I need to consider the cultural, political, and social impacts on the department and the university as a whole. I also need to think about the budgetary factors involved, as the main investors in the university are external. Although the budget is allocated by upper management at the university, it is ultimately determined by state legislators. This political factor could potentially affect any growth plan that I develop.

Pop-up books do not support emergent literacy!

Chiong, C., & DeLoache, J. S. (2012). Learning the ABCs: What kinds of picture books facilitate young children’s learning? Journal of Early Childhood Literacy, 13(2), 225–241. doi:10.1177/1468798411430091

Summary

Chiong and DeLoache (2012) explored the question of “what kinds of picture books facilitate young children’s learning?” (p. 225). In the emergent literacy phase, which spans ages zero to four, many children acquire literacy skills through the interactions they have with their parents and other caretakers. What the authors of this study wanted to know was whether the books being used in these interactions actually help children develop literacy skills. To explore this question, the authors conducted two studies with children ranging from 30 to 36 months of age. Some children were given a normal children’s book without manipulative features, and others were given a book that had them. The results of the first study showed that children acquired fewer letters with the books that contained manipulatives than with the standard books. In the second study, when the manipulative features were directed toward the actual letters the researchers wanted the participants to learn, there was no noticeable effect. This study therefore showed that manipulatives in books can distract children from acquiring literacy skills.

Organization

The organization of this article was very easy to follow. The article started with the abstract and an overview of the existing research, followed by a summary of the first study, then the second, and a discussion/conclusion. What was most effective about the organization was the use of subheadings under each major section. For example, under the main heading for Study 2, there were subheadings labeling the participants, materials, procedure, and results/discussion. Additionally, there were very clear images placed directly in the applicable areas, such as examples of what the different books they used looked like. I like that I did not have to go look in an appendix for this; having the book examples right there allowed me to think about the images as I continued reading.

Contribution to the Field

This study gives early childhood researchers and educators an idea of what ineffective books for developing early literacy skills look like. From their work, we know that pop-up books are less effective in getting children to master the alphabet than your standard 2D book.

Theoretical Framework

The framework this research is based on is the generally agreed-upon idea that parent facilitation of book interactions in the early years is crucial to developing early literacy skills. It is also accepted that learning letter names and letter sounds in conjunction with one another is a best practice. The researchers cited a meta-analysis of early literacy research finding “that interactive shared book reading was associated with increased expressive vocabulary, especially for two- to three-year-olds” (Chiong & DeLoache, 2012, p. 226). However, there is still a large debate about how best to teach children to read. Therefore, the researchers tried to further investigate how the content of the books that children interact with affects their learning. According to previous studies, “the nature of the pictures with which [children] had been taught influenced how well the children performed in the tests” (Chiong & DeLoache, 2012, p. 227).

Data Collection Methods

In the first study, 48 children participated. Children were given one of three alphabet books: one that was standard, one that had 3D manipulative elements, and one that was the same as the 3D book but with the manipulatives taken out. Children were tested on their prior letter knowledge, and parents also completed a survey about how many letters their child knew. Then an adult reader read the book with the child, during which the child heard the letters they would be tested on six times. Finally, children were given a test on letter naming and letter recognition.

In the second study, 64 children participated. The procedures were similar to those of the first study. In this one, however, some children were given books in which the letters were made of sandpaper. The kids with the sandpaper letters were asked to trace the letters, and the kids with the normal letters were simply asked to point to them. Just as in study one, the children were given a letter naming and letter recognition task immediately afterwards.

Findings

Children performed worse on the tasks when they had the 3D book, which led the authors of the study to conclude that manipulative books are distractions. As for the children who traced the sandpaper letters, there was no evidence that this interaction positively impacted their letter naming or recognition. However, the authors concluded that there was also no detrimental effect of tracing a letter that had a sandpaper texture.

Miscellaneous

I think that this article would be beneficial for any parent to read. It is written in such clear language that I think it could be accessible to many parents outside of academia. It made me think about myself and my process for selecting books for kids. I always explore the children’s section of bookstores for my students and for my niece, and I am now going to look at the ‘cool’ kids’ books at Costco and Barnes and Noble much differently. Essentially, this article sends parents and educators a really simple message: look closely at what the actual goal of the book is. The goal should be to build a child’s literacy skills through practicing reading a certain set of words or letters. However, this article demonstrated that the things that may make books seem ‘fun’ are actually just distractions and do not help kids meet the true goal of the book. Obviously, the pop-outs, manipulatives, and so on found in kids’ books make the books more sellable, which I am sure is why it is done. However, it is our responsibility as educators to inform everyone we know about this problem in our children’s literature so that parents can focus their energies on books that will actually increase access and excellence in education.

New ideas this study suggests for my area of interest

I am interested in researching the best approaches to early literacy. In conducting my annotated bibliography, I have been focusing directly on reading curriculum and instruction. This article gave me the idea that my study could involve parents as part of the process. Perhaps I could examine what the effect of giving parents a workshop on the ineffectiveness of manipulative books would be. What if I did the same for teachers?

Further study

I think this research could definitely benefit from looking at what is currently in pre-K and kindergarten libraries. Are the books we are providing our students full of manipulatives? This also got me thinking about the supplemental materials we provide our students. A typical activity I have observed in a kindergarten classroom is kids cutting and gluing letters onto a matching letter. Does this mean students are actually learning their letters, or are they simply learning to cut and glue? Obviously, motor skills such as gluing and cutting are necessary for students to master, but can the fine motor activities we provide for students replace the actual learning of their letters and sounds? I would like to know this to get more insight into what the barriers are to getting our students to grow in their reading.

Developing Teachers within their Context

Matsko, K. K., & Hammerness, K. (2013). Unpacking the “Urban” in Urban Teacher Education: Making a Case for Context-Specific Preparation. Journal of Teacher Education, 65(2), 128–144. doi:10.1177/0022487113511645

 

Summary

The article questions the status quo of teacher preparation colleges across the country, which in most cases prepare teachers in a very standardized way, with coursework focused on pedagogical practices, content knowledge, and special education instruction. The article focuses on the need for context-specific teacher preparation as opposed to a standardized national curriculum. The authors cite numerous studies that examine how the teaching environment can affect classroom culture and outcomes, and they attempt to study the various context-specific teacher colleges around the country.

 

The authors study UChicago UTEP’s teacher preparation program and the steps it takes to prepare teachers in a context-specific way that supports them in entering classrooms in the communities close to the university. A couple of things about the UTEP curriculum are of note. Candidates are educated about theories around “funds of knowledge” and the unique perspectives that students bring to the classroom. They are also required to spend clinical hours in a local charter school to ground themselves firmly in a local classroom experience. In addition, they complete two major projects: the “school study,” in which candidates engage deeply in a study of the community, and an “interactive read-aloud,” which gives them perspective on the classroom experience.

 

The authors conclude that this context-specific design for a more nuanced teacher preparation program is very valuable for new teachers. The context-based education helps to unpack the “urban” in urban education and dispel some of the biases that new teachers may have upon entering the classroom. The authors develop a framework that can be used for context-based teacher preparation in an urban setting.

 

Review Comments

Organization

The authors organize the article by first establishing the purpose of their study. The need for specialized preparation for urban teaching seems obvious, as the urban setting requires teachers to be able to adapt their classrooms to the students who enter them and to remain flexible throughout a school year. The authors then go on to describe the various context-specific teacher programs that exist across the country, analyzing and drawing comparisons between these programs to develop a context-specific framework. The authors close the study by presenting their framework and again arguing for the need for context-specific teacher preparation programs in an urban setting.

 

Contribution to Field

The article serves to further claims regarding the unique set of teacher skills required in an urban setting. The “urban” teacher needs to be hyper-reflective and willing to adapt to and learn from the “funds of knowledge” that students bring to the classroom. The authors contribute to this area of educational research by building a framework for context-specific teacher preparation. This framework can inform teacher preparation programs across the country on how they can prepare urban teachers.

 

Theoretical Framework

The framework of this study is essentially a case study of various teacher preparation programs across the country. The authors seek to compare and contrast teacher preparation programs and their varying philosophies to arrive at a framework for future context-specific teacher preparation programs. By doing a case study of the Chicago UTEP campus, the authors are able to identify key levers in creating a context-specific teacher preparation program.

 

Data Collection

Data were collected through qualitative observations of, and interviews with, candidates at the teacher preparation college. The authors did an in-depth qualitative analysis of the methods utilized in the UTEP program and synthesized this data to create a framework for future programs.

 

Findings/Discussion/Conclusions

The authors found that there were key factors that differentiated the UTEP program from traditional teacher preparation programs across the country. They offer a framework that grounds teacher preparation in multicultural education with an emphasis on social justice and equity. The framework takes important steps to develop teachers in a way that gives them insight into their teaching context, from socio-political norms to local community practices. This is important because it will prevent teachers from making broad and unrealistic generalizations about their students and their community. In multicultural education, a context-specific education is essential to help prepare teachers to appreciate the funds of knowledge that their students bring to the classroom. The framework that is developed can be used by teacher preparation programs across the country that seek to ready teachers for urban communities.

Who is responsible for my graduation?

Christian, T. Y., & Sprinkle, J. (2013). College student perceptions and ideals of advising: An exploratory analysis. College Student Journal, 47(2), 271–292. Retrieved from http://essential.metapress.com/content/7781552X33313470

“Advising is viewed as a teaching function based on a negotiated agreement between the student and the teacher in which varying degrees of learning by both parties to the transaction are the product” (Crookston, 1972).

This article sought to assess advisor performance as a function of students’ expectations of the advising experience and of students’ own sense of responsibility. The researchers developed a questionnaire intended to identify the appropriate type of advising based upon students’ needs, perceptions, and locus of control. The researchers deployed the questionnaire and evaluated the responses. Multiple hypotheses were presented and evaluated, and the article concluded with recommendations for future study.

Literature Review
The researchers described two execution styles for advising: prescriptive and developmental (or collaborative). The developmental approach is a collaborative relationship in which the responsibilities of advisor and student are clear. One who has an internal locus of control will “take responsibility for their actions, achievements, and consequences.” Those with an external locus will not take that internal responsibility, instead focusing on external factors influencing success. The researchers suggest a correlation exists between locus of control and preferred advising approach: “Viewed through this lens, college students with an internal locus of control are likely to prefer collaborative advising while those with an external locus will prefer prescriptive advising as it places the burden of responsibility firmly on the advisor.” The researchers further suggested that as we age, our internal locus of control becomes more developed and we come to prefer collaborative advising.

Minimal literature was provided. Student development theory, as introduced by Crookston (1972), is presented but not fully explained. For example, student development theory uses alternative terms (not just responsibility) to describe this process of internalizing responsibility, and it is often viewed as a continuum. While Crookston (1972) does support the integration of a sense of self-responsibility and ownership, the advising relationship is much more complex: “Historically, the primary focus of both the academic advisor and the vocational counselor has been concerned with helping the student choose a major or an occupation as a central decision around which to begin organizing his life. The emergence of the student development philosophy in recent years necessitates a critical reexamination of this traditional helping function as well as the assumptions which undergird it” (Crookston, 1972). Essentially, only one factor of developmental advising was considered, and the responses by the students could have been much more a function of the other factors introduced by Crookston (1972).

The rationale for assessment in advising is lacking in the literature review. Only one factor, time to graduation, is presented. Current research suggests that drivers for assessment also include student retention, academic performance, and advising performance management (Teasley & Buchanan, 2013).

Research
The researchers developed and deployed a questionnaire. Its purpose was to investigate which type of advising was being utilized (prescriptive or developmental) and to capture the students’ ideals of advising. The analysis was intended to discuss the relationship, if any, between those two factors. The researchers developed several hypotheses: that students are currently receiving developmental advising and that advising becomes more prescriptive as students age; that gender and ethnicity do not influence student perceptions of advising interactions; that GPA correlates with a tendency toward prescriptive advising; and that students will use their ideals of advising as a foundation for their own assessment.

A 50-question questionnaire was administered to students in multiple courses within the same department. A total of 125 completed questionnaires were returned to the researchers. Class time was provided to complete the questionnaire, but participation was not mandatory.

Two sub-scales were considered in the analysis: student perceptions and student ideals. Factor analysis was used to evaluate the data obtained. The researchers explained how they employed the factor analysis: they retained factors whose loadings were higher than their preferred level, and any questions with loadings lower than those minimums were removed. After analyzing the results, each hypothesis was discussed in terms of the findings.
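To make this procedure concrete for myself, here is a minimal sketch, in Python, of how an item-level factor analysis with a loading cutoff could be run. This is my own illustration, not the researchers’ actual procedure or instrument: the file name, item columns, the choice of two factors, and the 0.40 cutoff are all assumptions for the example.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical item-level responses: one row per respondent, one column per
# questionnaire item (e.g. q1 ... q50). Illustrative file name only.
items = pd.read_csv("advising_questionnaire.csv")

# Extract two factors (e.g. perceptions vs. ideals) with a varimax rotation.
fa = FactorAnalyzer(n_factors=2, rotation="varimax")
fa.fit(items)

# Flag items whose highest absolute loading falls below a chosen cutoff; the
# researchers used their own preferred minimum, and 0.40 here is illustrative.
loadings = pd.DataFrame(fa.loadings_, index=items.columns,
                        columns=["factor_1", "factor_2"])
weak_items = loadings[loadings.abs().max(axis=1) < 0.40].index.tolist()
print("Items that would be dropped:", weak_items)
```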

Limitations. The researchers proposed two limitations: a relatively small sample size was used, and all students were from the same academic department. However, other limitations could be considered, such as the fact that no distinction was made between faculty advisors and professional advisors; an extension of that limitation is advisor training and development. Finally, multiple factors, as described by Crookston (1972), influence the advising relationship, not just a student’s locus of control.

The source of the questions was not included. The article by Teasley & Buchanan (2013) included that detail and also highlighted the refinements made to the survey with each round of analysis. This research presented conclusions and a tool that has been tested much less than the tool introduced by Teasley & Buchanan. Duplication of these findings or enhancement of the assessment questionnaire is much more challenging based upon this research.

The questionnaire was administered to only 125 students across three courses within the same department, yet the researchers included conclusions based upon the hypotheses. A larger sample size and additional classes could further support the application of the findings; the researchers themselves identified these factors as limitations.

Future research opportunities. An advising syllabus has become a key piece of advising execution (Trabant, 2006). An advising syllabus is a tool with which advisors convey their responsibilities and highlight the students’ individual responsibilities. I appreciated what the researchers were trying to capture in terms of the students’ perceptions of their own level of responsibility. The syllabus is intended to convey responsibility and establish a foundation for the advising relationship. A new idea to consider is how an advising team establishes the syllabus and what is intended by it. From there, is it appropriate to also consider assessing the correlation between what is written in the syllabus and what is executed by the advisor?

It was interesting that the researchers included a psychological perspective in the assessment of advising. Typical advising literature considers assessment based upon generally accepted advising development theories (Williams, 2007), and this could be an expansion of that consideration. McClellan (2011) suggested the use of widely accepted business assessment models. These different approaches add a new lens through which to evaluate student expectations and development, along with advisor performance.

References

Crookston, B. B. (1972). A Developmental View of Academic Advising as Teaching. Journal of College Student Personnel. US: ACPA Executive Office.

McClellan, J. L. (2011). Beyond Student Learning Outcomes: Developing Comprehensive, Strategic Assessment Plans for Advising Programmes. Journal of Higher Education Policy and Management, 33(6), 641–652. doi:10.1080/1360080X.2011.621190

Teasley, M. L., & Buchanan, E. M. (2013). Capturing the Student Perspective: A New Instrument for Measuring Advising Satisfaction. NACADA Journal, 33(2), 4–15. doi:10.12930/NACADA-12-132

Trabant, T.D. (2006). Advising Syllabus 101. Retrieved from NACADA Clearinghouse of Academic Advising Resources Web site:
http://www.nacada.ksu.edu/Resources/Clearinghouse/View-Articles/Creating-an-Advising-Syllabus.aspx

Williams, S. (2007). From Theory to Practice: The Application of Theories of Development to Academic Advising Philosophy and Practice. Retrieved from NACADA Clearinghouse of Academic Advising Resources Web site:
http://www.nacada.ksu.edu/Resources/Clearinghouse/View-Articles/Applying-Theory-to-Advising-Practice.aspx

 

Bridging the Gap: Neuroscientists and Educators

Devonshire, I. M., & Dommett, E. J. (2010). Neuroscience: Viable applications in education? The Neuroscientist: A Review Journal Bringing Neurobiology, Neurology and Psychiatry, 16(4), 349–356. doi:10.1177/1073858410370900

Summary

The premise of this article is that developments in neuroscience could help the field of education, but currently they don’t. The authors explore why this gap exists and give three theoretical barriers and two practical barriers to collaboration.

Theoretical barriers. The first theoretical barrier is that the two fields have fundamentally different goals, and those goals are pursued in different ways. Neuroscientists are interested in the workings of the brain, the architecture of the mind, and how the two work together; educators develop pedagogy. The second theoretical barrier is that the levels at which the two fields investigate are different. Devonshire and Dommett name five levels of investigation: individual genes/proteins, neurons, functional circuitry (brain regions and circuits), syndrome (all study related to a disorder), and normal behavior. Most neuroscientists work on the first three levels, while educators are involved with the last level. Neuroscience hasn’t made much of a contribution to the study of normal behavior, since most funding goes to the study of dysfunctions. The “gold standard” is to do research with healthy humans in suitable environments (p. 352), a standard that is difficult to attain when dealing with children, schools, and funders. The third theoretical barrier is translating the content from one field to another. Teachers are enthusiastic about learning neuroscience, but the neuroscientists are “cautious…for fear of seeing their work lost in translation” (p. 349). Also, neuroscience can be used to assess educational theories and practices, but it is difficult to use it to create theories.

Practical barriers. The first practical barrier is that the two fields use different working languages, which creates a need for people who are bilingual. It is suggested that neuroscientists need to learn to communicate better with teachers and the general public. The lack of a common vocabulary means that neuroscientists and teachers have different definitions of such basics as “learning” and “research.” Research design in the two fields is also very different: neuroscientists work in labs where they can isolate individual variables, while educators deal with a variety of variables and accept qualitative research as valid. Action research lacks the stringent controls that neuroscientists demand. The second practical barrier is finding time and a suitable environment for teachers and neuroscientists to collaborate.

Strengths and Critiques

This article is very well organized into two main sections (theoretical and practical barriers), and each of these is divided into sub-sections. This structure, along with clear headings, makes it easy for the reader to follow the line of reasoning. However, the conclusion misses some of these points in its summary. Because the article was published in The Neuroscientist, and given the tone of the writing, the audience is neuroscientists, and the article gives them thoughts on how to bridge the gap between science and society, in this case educators. Unfortunately, it seemed to reinforce stereotypical characterizations of both scientists and educators; I am not sure if this is the fault of the authors or of the reader. The picture painted is one of the scientist working in a sterile lab, unable and unwilling to communicate with outsiders. Teachers, on the other hand, lack the ability to understand research from a field as stringent as neuroscience, which is “impossible to understand by educators” (p. 353). “Teachers know very little about the brain, and in some instances, their knowledge was not only poor, but actually incorrect” (p. 352).

The article is a good read for contemplating why partnerships are not successful, and gives the reader hope that the gap can be bridged.  There are very specific areas for work.

Relation to other readings

I found this to be an interesting follow-up to our class and readings about communities of practice. One thing that really stood out for me was the discussion of border practices, explained by Wenger (2000, pp. 232-238). Devonshire and Dommett’s analysis of the barriers leads to the conclusion that something must be done to bridge the two communities of practice. The article is a step in the right direction, enlightening the two fields about the differences between them. Understanding these differences is the first step toward bridging the gap.

Wenger also points out that for collaboration between communities of practice to occur, there needs to be some common ground, as well as areas of real difference (p. 233). Devonshire and Dommett clearly delineate the areas of difference but leave the question of common ground unanswered. They even make a point of taking areas that seem to be common ground (e.g., learning takes place in the brain and mind) and turning them into areas of difference (e.g., disagreement about what learning is).

I think Devonshire and Dommett would agree with Wenger when he says that the two communities need to find ways to “translate between repertoires so that experience and competence actually interact” (p. 233).

Both articles point to the fact that much work needs to be done in order to have a true collaboration between neuroscientists and educators.

Implications for research

I see many important implications for my own research. First, I will need to learn more about the field of neuroscience, trying to educate myself so that I can see through the lens of the scientists. I will need to undertake what Devonshire and Dommett deem “impossible”: being an educator who is trying to understand neuroscience. There is nothing I like more than a challenge! I am now cognizant that words may have different definitions and connotations in neuroscience than they do in education, and that I will have to understand scientific research methods.

They also point out, through many examples, that information found in teacher magazines and mainstream media may be inaccurate. Two examples given are the idea that a person is right-brained or left-brained (in fact, both hemispheres work together) and the fact that there is no conclusive research about the causes of dyslexia, even though policy-makers and teachers subscribe to certain research-based beliefs while ignoring other research that validates other ideas.

References

Wenger, E. (2000). Communities of Practice and Social Learning Systems. Organization, 7(2), 225–246. doi:10.1177/135050840072002

 

 

Indigenous Students Identify Six Factors That Impact Degree Completion

 

Guillory, R. M., & Wolverton, M. (2008). It’s about family: Native American student persistence in higher education. The Journal of Higher Education, 79(1), 58-87.

I went to elementary school in Winslow, Arizona, and completed junior high and high school in what was then called South Central Los Angeles. My high school education was as substandard as it gets! So much so that I played catch-up when I started at the University of Southern California straight out of high school. It wasn’t until the second semester of my junior year that I figured out how college worked, what resources were available to me, and what I needed to graduate. Much of this transition occurred because I became a student worker in the Liberal Arts and Sciences department and had complete access to academic advisors in my degree program. I also sought out work with the local urban Indian social services center, where I could interact with other Indigenous people and obtain much-needed financial and social support. I concur with the Indigenous voices in Guillory and Wolverton’s (2008) research: there are persistence factors unique to Indigenous students that ultimately lead to degree completion.

Summary

This qualitative study used focus groups and interviews with 30 Native American university students, along with the collective institutional voices of state representatives, university presidents, and faculty at three institutions, to examine the persistence factors and the barriers to retention and degree attainment for Native American students. Both groups, students and institutional voices, were asked the same questions. The findings indicate that the two groups hold contrary ideas about persistence factors and barriers. Students identified family, giving back to the tribal community, and on-campus social support as their most important persistence factors, while institutional voices listed adequate financial support and academic programs as persistence factors for Indigenous students. The barriers named were incongruent as well; there is clearly a disconnect between the groups, which has a negative impact on Indigenous student degree completion.

Theoretical Framework and Lens for Analysis

After discussing the theories other researchers have used to explain Indigenous learner college retention and degree attainment (Tinto’s Theory of Student Departure, Pascarella’s General Model for Assessing Change, Astin’s Theory of Involvement, and the Model of Institutional Adaptation to Student Diversity), mostly mainstream theories applied to Indigenous learners, the authors explain their rationale for using a qualitative, multiple-case-study-like approach to center Indigenous learners’ voices. The authors quote Tierney (1990): “what we need now are sensitive studies that move beyond statistical surveys…Rather than research about American Indians for policy makers… we need studies by and for Native Americans about their relationship to the world of higher education” (as cited in Guillory & Wolverton, 2008, p. 61).

The authors used focus group interviews lasting 90 to 100 minutes to capture the student voices. Students were selected by referral from the universities’ Native student centers and also completed questionnaires to provide background information. The students represented 20 Indigenous nations and were mostly juniors, seniors, and graduate students, and they all hailed from tribal communities or border towns. Nine students identified as first generation, while 12 students had at least one parent who held an AA, bachelor’s, or master’s level degree. In the focus group interviews, students were asked to name and explain the most important factors that had helped them persist at the university thus far, as well as the barriers that Indigenous students must overcome to be successful at the university. The institutional voices at the three rural universities were asked the same questions.

The authors used within-case and cross-case analysis to examine the similarities and differences among the three universities, looking for processes and outcomes that were thematic across all three. They used the Family Education Model (FEM) as a framework to analyze the results. The FEM is a model developed for use with Indigenous social work students that promotes action and intervention to improve Indigenous learner retention and degree attainment.

Strengths and critiques

This study has implications for all colleges and universities that educate Indigenous learners. These findings reinforce what Roppolo and Crow caution us against: using essentialist notions to create Indian education instead of assessing Indigenous learners to better meet their educational needs, as evidenced by the incongruences between students and institutional administration and staff. The article is a major contribution to the field of Indigenous student persistence in higher education because it centers the Indigenous student voice and offers solutions to improve retention and persistence. The qualitative approach allows a richness that cannot be seen in raw data such as graduation and stop-out rates. The authors suggest conducting more studies with Indigenous students at different colleges and universities, as there is little research that centers the voices of Indigenous university students, especially urban Indigenous students.

Related research and conversations

The findings in this research are consistent with the case study in the Campbell (2007) article I posted last week, in which a partnership between Pima Community College and the Tohono O’odham Nation addressed the need for students to maintain family and community ties, receive extra tutoring to meet the demands of college-level work, and access financial support for students and families.

As an educator at a community college, an assessment of Indigenous learners’ needs seems like the first step in my practice. Researchers suggest using student voices to plan interventions that honor the importance of family and community, along with financial support, to make institutional changes. Though my college is located in an urban environment, the Indigenous communities surrounding us have been impacted by their close proximity to the city. I am interested to learn whether urban Indigenous students raise the same concerns as students at the rural universities.

References

Campbell, A. (2007). Retaining American Indian/Alaska Native students in higher education: A case study of one partnership between the Tohono O’odham Nation and Pima Community College, Tucson, AZ. Journal of American Indian Education, 46(2), 19-42.

Guillory, R.M., & Wolverton, M. (2008). It’s about family: Native American student persistence in higher education. The Journal of Higher Education, 79(1), 58-87.

Tierney, W. G. (1990, May). American Indians and higher education: A research agenda for the 90s. Paper presented at the Opening of the Montana Pipeline: American Indian Higher Education in the Nineties Conference, Montana State University, Bozeman, MT.

An Analysis of a Comparative Study of Taiwanese Aboriginal and American Indian Identities’ Impact on Educational Issues

Cheng, S. Y., & Jacob, W. J. (2008). American Indian and Taiwan Aboriginal Education: Indigenous Identity and Career Aspirations. Asia Pacific Education Review, 9(3), 233–247. doi:10.1007/BF03026713

When approaching an issue or challenge, it is of utmost importance that all perspectives be considered.  One powerful perspective comes from critically comparing and contrasting two seemingly similar groups or ideas.  The results highlight insightful similarities and dissimilarities as well as causes and effects.

This methodology of critically and qualitatively comparing two traditionally colonized and marginalized groups is especially beneficial in educational action research. The insights garnered through analysis of two groups can tease out not only commonalities and differences but also an understanding of how and why they arise.  One such study was conducted by Cheng and Jacob (2008) in their article American Indian and Taiwan Aboriginal Education: Indigenous Identity and Career Aspirations.

In the qualitative study performed by Cheng and Jacob (2008), standard comparative case study analysis was implemented to dissect the similarities and differences between a high school in Taipei, Taiwan, and a high school in Los Angeles, California, in the United States.  The case study was segmented into three stages: a design stage, a conducting stage, and an analysis stage.  In the first stage, the researchers identified the research as an exploratory case study and selected twelve Taiwanese Aboriginal and American Indian students, stratified by ethnicity, grade, and gender.  In the second stage, the researchers devised a survey covering identity, educational experience, and career aspirations and conducted participant observations and in-depth interviews.  Most of the interviews lasted 20-30 minutes, but a few participants talked for an hour about the topics covered in the survey.  During each interview, the researchers wrote field notes and recorded the conversation with a digital recorder.  Upon completion, the interviews were transcribed and coded for cross-analysis.  The third and final stage was the analysis of the data collected from both high schools (Cheng & Jacob, 2008).

It is important to note that Cheng and Jacob (2008) integrated standpoint theory into the development and analysis of the research. Standpoint theory is borrowed from gender studies, a growing investigative field that highlights sociocultural and political systems of bias, oppression, and power. Standpoint theory calls on researchers to account for any bias they may bring to the research process that could influence the outcome of the study.

The qualitative comparative study revealed that Taiwanese Aboriginals and American Indians face many similar identity and educational issues.  Both are dissociated from their identities due to sociocultural and political oppression and marginalization.  Most of this oppression and marginalization, in both cases, stems from a lack of exposure, engagement, and support in traditional language, cultural practices, and communities. Both groups experience challenges in academic achievement that are associated with this dissociation from identity.  However, the study also highlights differences in how these outcomes come about.

The Taiwanese Aboriginals experience much stronger and more blatant oppression than the American Indians.  The Taiwanese Aboriginal student participants reported that teachers and students consistently perpetuate ethnic stereotypes in school through their comments and their trivialization of alcoholism and drug abuse. Although the government mandates that traditional languages be offered weekly, the traditional language is the last of four languages the students are required to learn.  The students also do not want to learn or engage in traditional activities because few are available in the cities. These issues have resulted in academic underachievement (Cheng & Jacob, 2008).

The American Indian student participants reported that they do not experience much racism or stereotyping due to Los Angeles being so diverse and multicultural.  They also stated that in school, they do not receive indigenous education or language courses, but they do not feel discriminated against.  However, they lamented that teachers were not knowledgeable about indigenous cultural practices and beliefs and did not integrate them into classroom lessons.  The students were able to engage in some traditional cultural practices such as powwows, even though they do not regularly visit their tribal communities on reservations.  The language loss is also the result of the students being raised by non-American Indian parents or, if their parents are American Indians, the parents not knowing the traditional languages.  These challenges have resulted in academic underachievement and high dropout rates (Cheng & Jacob, 2008).

Although I have no personal or work experience with Taiwanese Aboriginals, I have lived and worked in the heart of the Navajo Nation for three years.  The results rendered in the study were exactly what I encountered on the reservation, with the exception of children being raised by non-American Indians.

I ventured out into the Navajo Nation as an undergraduate student-teacher from Indiana University.  Through the Cultural Immersions Program, I was required to research, learn, and engage in meaningful discussions of Navajo culture and educational issues for an entire year before moving to the reservation.  Once on the reservation, I was overwhelmed, but my conceptual knowledge of Navajo culture helped me connect and, through the generosity of those in the community, transform my knowledge into practice.  I was considered a staple in the community after just one year of teaching, as the school district in which I taught always experienced high teacher turnover.  When I asked teachers why they were leaving, they cited not understanding the students, finding the environment too rural, or not feeling welcomed.  I extended a few invitations to traditional cultural ceremonies and activities, whenever it was respectful to do so, to a few non-American Indian teachers, only to be turned down most of the time.

I was active in the community, tried to learn the language, and was very respectful of cultural beliefs and practices.  I not only saw these as opportunities to improve myself through broadening my worldview, but also as a means of helping my students connect to the material I was required to teach them. I often pushed myself with the question, “How can teachers make classroom lessons relevant to students’ cultures and lifestyles if they do not engage in them themselves?”  I was surprised when the school district wanted to highlight me as one of the few teachers who integrated cultural and lifestyle aspects into my classroom lessons.  What further surprised me was that I was the only non-American Indian who was trying to make my lessons culturally relevant for my students.  So, when reading the results of the study, I was not confounded when the American Indian students stated that their teachers were not knowledgeable or incorporating cultural relevancy into their classrooms.

Therefore, the question raised by these results is, “How does localized indigenous cultural teacher training impact academic achievement and teacher retention rates in American Indian communities?” Research concentrating on localized indigenous teacher training is relevant to the educational issues surrounding American Indian high school and higher education graduation rates.  If education is made more accessible through culture and relevancy, then the assumed result would be an increase in academic achievement.  Also, if students are more responsive to classroom lessons, teachers would be less frustrated and overwhelmed and more likely to stay in the community.  Retaining teachers is crucial to the long-term academic success of American Indian students because it reinforces the much-needed academic and personal support of students.  This research idea is just one more perspective and analysis that must be explored.  Thus, multiple means and perspectives of critically analyzing the cultural identities and educational issues surrounding indigenous peoples are pivotal to their academic success and ultimate self-determination.

Research Article Review: Professional Socialization in Graduate Enrollment Management

In a recent conversation with a few of the senior members of the enrollment management staff for Arizona State University, a colleague said to me, “the odd thing about all of the materials out there about enrollment management is the titles that sound like ‘enrollment management revolution’ or ‘paradigm shifts through enrollment management’; if someone thinks the concepts of enrollment management are new or represent a paradigm shift, they’re already 20 years behind most of the industry.” It set me on an interesting path of reviewing the literature published on enrollment management and, really, something of a historiographical analysis of the field.

My colleague was correct when he said that there wasn’t much new about the idea of enrollment management as a field. Google’s Ngram tool, which charts the frequency of words and phrases in published literature over time, shows the phrase ‘enrollment management’ beginning to appear around 1975, yielding almost 40 years of material. Interestingly, the frequency trends dramatically upward around the year 1999, raising the question of what drove that increase.

I was reminded of a frequent frustration I’ve had in reviewing the enrollment management literature, as my reading has not yielded much material on the impact of enrollment management strategy on graduate education. When I came across an article in College & University magazine on the role of Graduate Enrollment Management, I was delighted to see how the work is situated in the field.

[Image: EM & Charts: A match made in heaven – “Enrollment management: we’re mostly charts.” Source: www.wku.edu]

In the article, Crossing the GEM Frontier: Graduate Admissions Professionals’ Participation in Enrollment Management, authors Dean Campbell and Jahmaine Smith take on a very interesting topic: the development of professional identity in the field of Graduate Enrollment Management. The article begins with a brief “state of the field” section, wherein the authors describe the blurred line between enrollment management as a set of practices and as a philosophy or mindset. They also describe enrollment management in the traditional sense as a set of practices that ties together the student recruitment, admission, retention, career, and alumni functional groups. The central research question examines how individual professional identity develops as the general tasks and responsibilities of the admissions function are integrated into the broader process of identity development.

The concept that had the greatest impact for me was professional socialization. This is the idea, supporting the general ‘community of practice’ notion, that there are internally reinforcing processes that inform identity. There are many components of professional identity: institution, background, communities, professional associations, individual departments, and inter-office collaboration, all in addition to the thoughts, beliefs, emotions, culture, and more traditional drivers associated with identity development. The notion of professional socialization holds that knowledge, beliefs, skills, and behaviors are specifically shaped by the different professional communities with which the admissions professional is affiliated.

The authors detail three components of identity in enrollment management (anticipatory, meaning development, and personal), discuss the socializing role of structural forces (i.e., institutions, departments, associations), and then take up the role played by the very existence of a field called enrollment management.

There’s some interesting descriptive work in their ‘Methods’ section, focusing on how they attempted to ensure validity in the qualitative information solicited, the themes identified, and the development of findings. Interestingly, there’s a revealing finding that the enrollment management labor force is largely drawn from individuals who have worked in the admissions field in some capacity.

Additionally, there are some interesting discussions of the roles of traditional identity types in enrollment management and how they both reinforce and produce cultural norms and socializing forces. The conclusion is really about how professional socialization can be used as an effective analytical lens for figuring out how the field works.

Overall, it’s a great article that I think I’ll be able to use in my work. Here are the big takeaways:

  • There’s a fantastic bibliography that indicates the article is well-situated in the literature
  • The concept of identity and profession as a framework for analyzing a field is extremely effective, or at least interesting, when I think about the normative role of executive leadership at the college and university level in higher education administration
  • There’s an additional reinforcement to the field with the notion of enrollment management as a set of standard practices, as a set of general styles, and as a philosophy or approach to working.
  • The role of the individual in development of strategy and operational design as professionally socially contingent – excellent!
  • The role of key socializing factors in defining the ability to be successful as an individual (and by extension as a department and as a unit/school)

 

References

Campbell, C. D., & Smith, J. (2014). Crossing the GEM Frontier: Graduate Admissions Professionals’ Participation in Enrollment Management. College and University, 89(3), 3-11. http://bit.ly/1oqLmhM

 

ACT Policy Report – A Guide for Practitioners

Lotkowski, V. A., Robbins, S. B., & Noeth, R. J. (2004). The Role of Academic and Non-Academic Factors in Improving College Retention. ACT Policy Report, (September 12, 2007), 1–31.

As a practitioner in the field of higher education, I am always excited to find innovative theories or programs in practice that will assist me in my work.  I particularly like to find articles that provide a theoretical background paralleling some of the work I am already doing.  While this report is now a decade old, its concepts and suggestions for implementation are more relevant to me now in my career, due to my role, spheres of influence, and the community of practice that I am a part of.

Summary:

ACT has put together a policy report that highlights some of the critical issues relating to retention and academic performance of college students.  This particular report focuses on the roles that both academic and non-academic attributes of the university experience play for college students.  The report used data from 109 different studies that met three criteria: each study had to examine the relationship between academic and non-academic factors, focus on full-time students at four-year institutions, and use standard measures while reporting pertinent information.

The report’s results break down the two focus areas, academic and non-academic, into defining factors as follows.

Academic

  • High school grade point average – HSGPA
  • ACT assessment scores – ACT
  • Socioeconomic status- SES

Non-Academic

  • Academic-related skills
  • Achievement motivation
  • Academic self-confidence
  • Academic goals
  • Institutional commitment
  • Social support
  • Contextual influences
  • Social involvement
  • General self-concept

Throughout the report there are varying combinations of the academic and non-academic factors, and some combinations yield stronger relationships and higher success rates than others.  What I find important is the strong indication that joint program development between academic and non-academic parties has the greatest chance of creating or improving student retention and success at the university.

Personal Application:

The areas in which I can apply this research are my roles within Orientation and Housing, as well as in the Dean of Students Office.  The researchers propose addressing the non-academic areas of achievement motivation and institutional commitment during early start programs.  I see an opportunity to change our orientation practice, making it more of an occasion for incoming students to start developing connections and skills, rather than the transactional model I feel we have now, where students hear some presentations, register for classes, and depart until their return in the fall.

Within the housing area, I would like to see more investment by our hall staff, including student staff, in a mentoring role that emphasizes the social support aspect during the early start programs. In particular, staff would need to be more intentional about expanding their knowledge of, and relationships with, the support staff who are here for the summer.  By making connections between students and support staff, we can give students an extended period to get acclimated and focused on finding academic support.

The final area where I would like to apply the concepts suggested by the article is the Residential College Advisory Board that I chair.  We will need to shift the focus from what the original group was formed to do: providing program support and consistent student expectations across the colleges.  I think this is the area most likely to implement something new, using the resources of faculty, advisors, and student affairs professionals to develop early warning systems, provide more individual attention to students, and build more consistent outreach to students.  This community of practice is one that I have been a part of for five years, and I have developed strong relationships with several of the longstanding members.

Limitations in my practice:

It has been my experience that change comes with resistance, even when there is empirical support to justify said change.  I do not say this to be negative, but as a realist.  Issues can arise when existing programs are producing positive results but have remained at the same level for some time.  Can the results be better?

As I think about implementation with summer programs, I see the opposite struggle from the one described above.  There has been no consistency in the early start programs: goals and target demographics have changed from year to year, and multiple programs have been created by both academic and non-academic areas that compete for the same students, resulting in some programs not fully functioning because they cannot reach the critical mass needed to be effective.

Final thoughts:

I will be looking into an update on this report and the test used within the study.  There is an area that I am particularly interested in: the socioeconomic status portion as it relates to recruitment and success.  I want to see whether there have been new discussions about admitting, or not admitting, students who are pre-identified as having low SES resulting in high need, and how retention rates relate to the investment in those students’ success.

My favorite parts of this report, as a practitioner, are twofold: (1) the inferred practice of collaboration and working inside and outside known work groups, and (2) the steps to development provided.  While I will not list them all, I will highlight the one that I feel is most important: “Widely disseminating results from the program evaluation” (Lotkowski, Robbins, & Noeth, 2004, p. 21).  From other discussions on communities of practice and the roles we play as researchers, I sometimes feel that as practitioners we do not share information that could benefit others…just something to consider.

References

Lotkowski, V. A., Robbins, S. B., & Noeth, R. J. (2004). The Role of Academic and Non-Academic Factors in Improving College Retention. ACT Policy Report, (September 12, 2007), 1–31.


Inconclusive research = lame duck

Eynon, R., & Helsper, E. (2011). Adults learning online: Digital choice and/or digital exclusion? New Media & Society, 13(4), 534-551.

As I continue down the breadcrumb path of research in the general area of my inquiry, I’ve decided to open up my research to encompass articles that touch on a range of topics:

  • adult online learning,
  • effective practices for online learning,
  • assessing online learning,
  • unfacilitated online learning,
  • facilitated online learning,
  • adult communities engaged/disengaged in online learning,
  • various theories and frameworks that relate to online learning,
  • and, what I’m calling, the catch-all connection to online learning.

If I cast a wide enough net, I’m sure to catch a guiding research question that makes the most sense for my work, and in the meantime I can build up a great arsenal of knowledge about what research has been done in online learning and how it was orchestrated.

This particular article falls under the umbrella of “adult communities engaged/disengaged in online learning”. The study took place in England, Scotland, and Wales and aimed to investigate which adults were engaging in online learning activities. These activities were organized into twelve general areas, including fact checking, travel, shopping, entertainment, social networking, e-government, civic participation, informal learning, and formal learning/training. The researchers wanted to delve further into the characteristics these adults had in common and also into what factors influenced them to engage or disengage with different types of online learning. They focused on a central divide in the population and wanted to distinguish between voluntary or choice-based reasons and involuntary or “digital exclusion” reasons (p. 536).

The researchers’ methodology used the 2007 Oxford Internet Surveys (OxIS) with a sample size of 2,350 people. These individuals were selected by first randomly selecting 175 regions in Britain and then randomly selecting ten addresses from there. The participants were 14 years of age and older, and all were administered the OxIS face-to-face.
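
To make the two-stage selection concrete, here is a minimal Python sketch of that kind of cluster sampling (regions first, then addresses). The region and address lists are invented placeholders, not the actual OxIS sampling frame or procedure.

```python
import random

# Hypothetical sampling frame: region IDs mapped to lists of addresses.
# These are invented placeholders, not the real OxIS frame.
regions = {f"region_{i}": [f"region_{i}_addr_{j}" for j in range(200)]
           for i in range(1000)}

random.seed(42)  # reproducibility of the sketch only

# Stage 1: randomly select 175 regions.
sampled_regions = random.sample(list(regions), 175)

# Stage 2: randomly select ten addresses within each sampled region.
sampled_addresses = [addr
                     for r in sampled_regions
                     for addr in random.sample(regions[r], 10)]

print(len(sampled_regions), "regions,", len(sampled_addresses), "addresses")
```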

As the researchers consolidated the results and captured their findings, they divided the disengaged respondents into two groups: ex-users and non-users. There were four key reasons why these individuals disengaged from using the internet: cost, interest in using the internet, skills, and access. It appeared that ex-users had made the choice to disengage because they were no longer interested in the internet, while the non-users highlighted issues of digital exclusion like access and cost.

When it came to investigating why users were disengaging from the internet for learning opportunities, the characteristics of the users did show patterns, but the patterns were complex, often depending on the type of online learning activity. Users who are highly educated, have children over the age of 10, and have high levels of internet self-efficacy were found to be more likely to engage in formal learning activities via the internet. An underlying element that was important for informal learning was having access to the internet at home. Also, upon analysis of the data set and its trends, the researchers began to see pockets of individuals who were “unexpectedly included” or “unexpectedly excluded” (p. 542).

They conclude that this investigation into user engagement and disengagement with the internet and online learning is important because it demonstrates that the more information organizations or educational institutions have about a user, the better they can provide tailored, differentiated support to increase the amount of learning activity that takes place.

If I were to turn my critical eye on this article, I unfortunately find more ways to improve it than strengths. I think one of the greatest strengths of the article is that the content is organized clearly, and it does a thorough job of contextualizing the importance of finding answers to the research questions.

An immediate area for improvement is the literature review and the theoretical framework supporting the research. It appears that more time was spent explaining the need for the research than on the content and theory that serve as the foundation of the work.

The data collection process in general appeared well thought out, with good randomization and quite a large sample size. Unfortunately, a few elements of the methodology are missing. Why was the OxIS survey tool used, and how was it the right tool for this research? What questions are on the OxIS? Also, there was no explanation of how the surveys were conducted beyond “face-to-face” (p. 537). Were responses recorded by hand by the participant or the researcher? How many researchers were involved, and was any training needed to maintain consistency amongst the team? And what protocol was used during the survey?

The analysis of the data and the findings were detailed, but it almost seemed like the research didn’t find much. And what was found supports what would already seem very sensible. As for the discussion and conclusion, it seems like the conclusion was that this survey could be done better in the future and that next time around they’ll also gather qualitative data. If that’s your conclusion, did we find out anything important in this study at all? And if the conclusion is that there should be more studies in the future, then that’s not much of a conclusion. Since the study lacked conclusiveness, it makes sense that the authors weren’t able to offer suggestions about what educational organizations could do to provide tailored or individualized supports. It also seemed like there was no clear way to distinguish between factors of choice and those of digital exclusion.

Personally, I think the researchers have a lot of room for development when it comes to building on this research. I agree with them that a strong next step would be to combine the survey with an interview to capture some qualitative data. I think it would be valuable to partner with one of the informal or formal institutions they identified, survey and interview learners before and after their online learning experience, and capture the actions the organization took to onboard learners, orient them to the technology and learning scope, and support their ongoing learning. This data, in concert with the other data collected, could help paint a clearer picture of who these online learners are, what needs they have, and whether the organization is fulfilling those needs.

I think this second round of research could have great potential for providing more access to equitable educational opportunities. If these researchers could really home in on the factors that exclude learners from online opportunities, or even on the actions that unexpectedly help include individuals, then that information could be directly used by organizations to help these learners access learning opportunities that otherwise would not “exist” for them.

This article continues to help paint the picture of how much support, thought and detail should go into your writing and research. Learned something important tonight—if you don’t have anything conclusive in your conclusions, then something went wrong!

Which teachers benefit from coaching?

Marzano, R. J., Simms, J. A., Roy, T., Heflebower, T., & Warrick, P. B. (2013). Coaching classroom instruction. Bloomington, IN: Marzano Research.

Ross, J. A. (1992). Teacher efficacy and the effect of coaching on student achievement. Canadian Journal of Education, 17(1), 51-65.

The journal article Teacher Efficacy and the Effect of Coaching on Student Achievement by Ross (1992) illustrates the link between teacher efficacy, instructional coaching, and student achievement. The researcher started with the question, “Who benefits from coaching?” (Ross, 1992, p. 62). The study included 18 history teachers in an Ontario school district. The selected teachers had a wide range of experience and demographic characteristics. The study also followed six coaches who supported the teachers. The identified coaches were highly competent in history and likewise had a wide range of demographic characteristics and experience. All eighteen teachers were tasked with implementing a new history curriculum and were provided with three main resources: the curriculum materials, three half-day workshops, and contact with the coach (Ross, 1992, p. 54). Contact with the coach was defined as face-to-face or virtual meetings, with a minimum requirement of one contact of each type between the coach and the history teacher. The district assigned coaches to each teacher based on physical location. The coaches had their own community of practice to support one another throughout the study.

The study collected data on student outcomes, teacher efficacy, and coaching.  Student outcomes were measured by a multiple-choice pre- and post-assessment in history. Teacher efficacy was measured through a self-report from the eighteen teachers: “Subjects used a six-point agree/disagree scale” (Ross, 1992, p. 55). The researchers collected data on coaching in two different ways, through an interview and a self-administered questionnaire.

Findings from the study indicated a significant increase from pre- to post-assessment on the student outcome measure. The teachers who had the most contact with their coach had higher student results. The author also found that teachers with higher self-efficacy had a higher frequency of interactions with the coach and higher student achievement results. Overall, the “investigation found that all teachers, regardless of their level of efficacy, were more effective with increased contact with their coaches” (Ross, 1992, p. 62). One of the surprising findings from this research was that teachers who had the most principal contact had some of the lowest student outcome results.

The discussion portion of the article was a strength. The author revisited the driving research question and how it was answered by the study. In addition, Ross (1992) shared three hypotheses he had going into the study and explained how each was or was not confirmed.  This was helpful because it gave the reader more insight into the design of the study. The author used this section to connect to other research, showing similarities as well as highlighting what was unique in this study. These intentional connections allow the reader to make sense of how this study fits into the field of coaching. Ross (1992) also used the discussion section to suggest possible future research.

One way to improve this study would be in the area of data collection, specifically the data collection on coaching. The self-administered questionnaire collected at the end of the study only gave information on the frequency with which the history teachers interacted with personnel resources. The questionnaire did not reflect the quality of the coaching interactions or whether the interactions had a direct connection to the student outcomes. In addition, the self-administered questionnaire did not cover only the coaches who were assigned to the teachers; it gathered information on several layers of support: the assigned coach, use of other teachers in the school, use of the coaching network, and school administrator support (Ross, 1992, p. 55). The data collection on how frequently the teachers interacted with the administrators and colleagues at their school did not seem to align with the driving research question, “Who benefits from coaching?” (Ross, 1992).

I think the study would have been improved if the researcher had collected data not only on the frequency of the interactions with coaches but also on the type and quality of those interactions. Ross (1992) explains that the coaching “relationship was less reciprocal in that the coaches were relative ‘experts’ in the history program and there was virtually no classroom observation component” (p. 54). Thus, coaches’ feedback was based almost entirely on teacher report and on other artifacts such as lesson plans and student work. I believe in the power of in-classroom coaching. Marzano and Simms’s (2013) work on coaching classroom instruction describes how “traditional professional development usually leads to about a 10% implementation rate” (p. 6). The authors go on to reveal that “our experience has shown that when teachers receive an appropriate amount of support for professional learning, more than 90% embrace and implement programs that improve students’ experiences in the classroom” (Marzano & Simms, 2013, p. 6). I believe the appropriate amount of support for professional learning includes assessing what coaching support would be best for the teacher, such as in-class observation, modeling, or team teaching. Therefore, it would have been beneficial to also collect data on the type and quality of the interaction.

I believe researching the impact coaches have on teacher effectiveness and student achievement is worthwhile and contributes positively to the field of education. I was an instructional coach for several years and had the opportunity to participate in a national coaching study with the American Productivity and Quality Center, with a leading expert in coaching facilitating the study. The goal of that study was to identify a direct link between the work of instructional coaches in supporting teachers and student achievement. After reading this article and participating in the APQC study, I am interested in continuing to research how access to coaches supports teacher effectiveness and student achievement.

“Learning styles” and education in a controlled environment

Pashler, H. et al. (2009). “Learning styles: Concepts and evidence.” Psychological Science in the Public Interest, 9(3), 106-119.

People learn best in different ways.  This is a deceptively simple and interestingly familiar idea in modern educational research and curriculum design.  It’s also a concept accepted—or at least understood—by a wider general public, and it fits nicely within the twenty-first-century cultural (and technological) context in which personalization is easily available, expected, and assumed best.  But regardless of this wider acceptance, is there quantitative evidence to support the theory?  Pashler et al. (2009) set out to explore the current literature, historical context, and quantitative support for what they term “learning styles.”  Through what historical context did this idea germinate?  What experimental methodology would best quantitatively prove its efficacy?  Has such research been performed in the current literature, and if so, what does the evidence show?

It’s all Jungian

The authors begin by situating the idea of categorizing people into disparate “types”; this, they explain, draws from the work of Jung, whose research in psychology and psychoanalysis led to the creation of behavioral tests—like the Myers-Briggs—that perform much the same function as learning styles.  They categorize people into “supposedly disparate groups” based upon a set of distinct characteristics, which in turn explains something deeper about a person.  Although the authors do not regard these tests as objectively scientific, they do note that these tests have “some eternal and deep appeal” (pp. 107) with the general public.

The authors hold that this “deep appeal” partially explains what draws researchers, educators, learners, and parents to the idea of learning styles.  Beyond offering a way to feel that a large and often cumbersome system is treating a learner uniquely, the authors write that learning styles can become a scapegoat for underachievement:

“If a person or person’s child is not succeeding or excelling in school, it may be more comfortable for that person to think the educational system, not the person or the child himself or herself, is responsible” (pp. 108).

Even given the evidence presented, this is an unfair characterization.  In their desire to explore the objective science of learning styles, the authors shut down consideration of a slew of external confounding factors, including socioeconomic stressors, racial background, and cultural barriers, all of which have a demonstrated influence on classroom performance (Howard 2003; Liou 2009).  More than that, however, this passage reflects an underlying bias in the authors’ commentary—that a theory is lesser when it speaks to people emotionally.

What are learning styles really for?

However, when the authors break down the unspoken hypotheses that govern the idea of learning styles, they make an excellent point.  There are two very distinct issues at play:

  1. The idea that if an educator fails to consider the learning styles of his or her students, their instruction will be ineffective (or less effective).  The authors also consider what they term the reverse of this assumption: that “individualizing instruction to the learner’s style can allow people to achieve a better outcome” (pp. 108).
  2. What the authors term the meshing hypothesis, which assumes that students are always best “matched” with instructional methods that reflect their learning style.

These hypotheses represent both disparate theories of curricular design and widely differing levels of analysis; whereas the first hypothesis presented above treats the assessment of learning styles as critical to the creation of a curriculum, the meshing hypothesis treats learning styles as more of a delivery method.  Most importantly, by conflating these two ideas in exploring this theory, researchers overlook the possibility that one may prove true while the other does not.

One experimental methodology to rule them all

Before reviewing the current literature, the authors outline, in the abstract, a simple experimental methodology.  They identify this methodology as the truest way to “provide evidence” of the existence and efficacy of learning styles, and they use it as a guideline to measure the quality of data in the existing literature.  The requirements are listed below:

  1. Learners must be separated into groups reflective of their learning style; the authors suggest “putative visual learners” and “auditory learners” (pp. 109).
  2. Within their groups, learners are randomly assigned one of two instructional treatments.
  3. All subjects are assessed using the same instrument.

In order to prove the quantitative efficacy of learning styles, the results of this experiment must show a “crossover interaction”: that the most effective instructional method is different for each group.  The authors note that this interaction is visible regardless of mean ability; if Group A scores wildly higher on the final assessment than Group B, a crossover interaction can still be observed.
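
To make the crossover idea concrete, here is a minimal sketch using invented mean scores (not data from any study the authors review): the “best” instructional method differs by group even though one group scores higher across the board.

```python
# Hypothetical mean assessment scores, invented purely for illustration.
# Keys: learner group; inner keys: instructional treatment.
mean_scores = {
    "visual_learners":   {"visual_instruction": 85, "auditory_instruction": 78},
    "auditory_learners": {"visual_instruction": 60, "auditory_instruction": 70},
}

# A crossover interaction is present when the best treatment differs by group,
# regardless of overall level (here, visual learners score higher overall).
best_treatment = {group: max(scores, key=scores.get)
                  for group, scores in mean_scores.items()}
print(best_treatment)
print("Crossover interaction observed:", len(set(best_treatment.values())) > 1)
```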

However, it seems that the authors are conflating their hypotheses in much the same way they say the literature does; assessing the learning styles of a class and identifying which instructional tools will best speak to a particular learning style are completely different processes.  The latter is subject to interference from several factors, not least of which is the assumption that all instructional methods are equally effective ways to explain the content at hand.  The authors also do not allow for these hypotheses to be proven independently true; by stating that the only acceptable outcome of this experiment is some magnitude of crossover interaction, they ignore confounding factors—the comparative strength of instructional methods relative to each other; the assumption that all learning styles are equally effective ways to engage with the content; the assumption that students who identify either an audio or visual strength will respond to the content in the same way—and assume that either both hypotheses are true or both are false.

But what are the tools for?

In their review of the associated literature, the authors identify only one article that both supports the existence of learning styles and uses their outlined experimental method.  They conclude that

“although [this study] is suggestive of an interaction of the type we have been looking for, the study has peculiar features that make us view it as providing only tenuous evidence” (pp. 112).

These tenuous features include the omission from the paper of each group’s mean scores on the final assessment (learners were instead matched with a control); the fact that learner performance was measured by raters; and the fact that the instructional treatments used vary significantly from those “more widely promoted” (pp. 112).

This lack of appropriate evidence, conclude the authors, demonstrates that the theory of learning styles is untested at best and nonexistent at worst.  However, the one point that the authors decline to discuss is why experimental methodology is best for “proving” this theory in the first place.  They assume that a controlled environment will provide truer or cleaner data without recognizing a singular truth of classroom education—there is no controlled environment.  Educators at the classroom level have no control over the previous education and content exposure of their learners; over the influences learners face outside of school; of the gender-based, racial or cultural experiences that shape a learner’s perception.  In such an environment, why would it matter to educators that one mode of assessing learning styles, or one instructional intervention, is statistically better than another?  That environment is far removed from the situation this theory is designed to elucidate.

The authors are also unaware of their own biases, namely in bridging the distance between an idea in theory and in practice.  They claim in their introduction that, because learning styles are so untested, meager educational resources should not be focused on studying them or including them in instructional design (pp. 105).  However, they fail to consider learning styles on a concrete level.  Is it truly more expensive to personalize a curriculum based on learning styles?  Does learner benefit need to be statistically significant in a controlled environment for it to be “worth” the effort?  Although the authors are in some ways critically reflexive of the unspoken hypotheses researchers assume in discussing learning styles, they are unaware of how their personal biases have shaded their commentary, which raises the question: To whom are the authors speaking?

Sources

Howard, T.C. (2003).  Culturally relevant pedagogy: Ingredients for critical teacher reflections. Theory into Practice, 42(3), 195-202.

Liou, D. D., Antrop-Gonzalez, R. A., & Cooper, R. (2009). Unveiling the promise of community cultural wealth to sustaining Latina/o students’ college-going networks. Educational Studies, 45, 534-555.

 

Emotional Intelligence Competencies Can Be Developed

Pool, L. D., & Qualter, P. (2012). Improving emotional intelligence and emotional self-efficacy through a teaching intervention for university students. Learning and Individual Differences, 22(3), 306-312.

Many researchers argue that emotional intelligence plays a significant role in our attitudes, health, well-being, and professional success (p. 306). If this is true, why don’t K-12 schools and colleges create and implement curriculums that support the development of these skills? “As undergraduate students are gaining qualifications, knowledge and skills to prepare them for future lives in the world of work, it would make sense to ensure they are also equipped with knowledge and skills in relation to emotional functioning and with the confidence to enable them to act on these abilities” (p. 306). This study investigates whether it is possible to improve levels of emotional intelligence (EI) and emotional self-efficacy (ESE) in university students through a teaching intervention (p. 307).

Pool and Qualter (2012) hypothesize that “both ability EI and ESE appear to be important predictors of academic success and graduate employability; theoretically, it should also be possible to improve them” (p. 307). Therefore, they designed a study to investigate whether or not it is possible to improve levels of emotional intelligence and emotional self-efficacy. Using university students, the authors studied the impact that an eleven-week intervention class had on the participants’ EI and ESE competency levels.

The organization of the article is clear, coherent and logical. The article begins with an introduction, which is broken down into subsections. The subsections include the following headings: the importance of EI and ESE (laying the foundation for the importance of emotional intelligence and emotional self-efficacy), Designing EI/ESE teaching interventions (describing the interventions and assessments), and the present study (explaining who the participants were). Following the introduction was the methods section. Again, the authors broke it down into subsections: Participants, Measures, EI Intervention, and Procedure. Throughout this section, the authors provided detailed descriptions of the study. Next, the authors included the findings and discussions section. In the discussion section, the authors reflected on what they learned, acknowledging that there were some limitations of the study.

Prior to this study, there was very little research regarding the ability to improve EI and ESE. One study that the authors investigated did not result in any improvements for the participants (that study consisted of a four-week intervention). Therefore, Pool and Qualter designed their intervention to take place over 11 weeks and found that it resulted in significant participant growth in EI and ESE. The results imply that people need a longer period of learning and reflection in order to develop their emotional understanding abilities. These findings should have significant implications for our K-12 schools and universities and for what we value in our curriculums.

It was evident that the authors performed extensive research on EI and ESE as well as investigating the studies that had already been conducted. In the introduction and throughout the article, they included research for every variable within their study. When designing this particular study, the authors built on the work of Nelis et al. (2009).

They began by designing the intervention for the study (based on the Salovey and Mayer Four Branch Model of ability EI) and identifying the pre- and post-assessments: the MSCEIT for EI and the Emotional Self-Efficacy Scale (ESES) for ESE. Their study included a larger sample size than the study conducted by Nelis et al. Additionally, they included both males and females from diverse academic concentrations. The study also included a control group.

The intervention class was offered to all students as an elective. The class was two hours per week and was eleven weeks in length. “Students completed the MSCEIT and ESES during the first class and were given a report and detailed one-to-one feedback of their results. They were asked to reflect on their results and incorporate these reflections in their first journal entry. The tests were repeated in the final class” (p. 308).

Throughout the class, the teachers implemented various activities including, “mini-lectures, video clips, case studies, group tasks and discussions, role play and an off-campus visit to an art gallery” (p. 308). Students were asked to keep a reflective journal as well as respond to essay prompts and case studies.

The findings were positive. After the 11 weeks, participants showed growth in emotional self-efficacy and some aspects of emotional intelligence ability. When measured against the control group as well as their pre-assessments, the intervention group showed significant improvement.
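
To show the shape of that kind of comparison, here is a minimal sketch of a gain-score contrast between an intervention group and a control group on simulated numbers; it is only an illustration under invented assumptions, not the authors’ actual analysis or data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated pre/post scores for intervention and control groups.
# All numbers are invented; the real study used MSCEIT and ESES scores.
pre_intervention = rng.normal(100, 15, 60)
post_intervention = pre_intervention + rng.normal(8, 10, 60)   # assumed average gain
pre_control = rng.normal(100, 15, 60)
post_control = pre_control + rng.normal(1, 10, 60)             # little change assumed

# Compare gain scores (post minus pre) between the two groups.
gain_intervention = post_intervention - pre_intervention
gain_control = post_control - pre_control
t, p = stats.ttest_ind(gain_intervention, gain_control)
print(f"mean gain (intervention) = {gain_intervention.mean():.1f}")
print(f"mean gain (control)      = {gain_control.mean():.1f}")
print(f"t = {t:.2f}, p = {p:.4f}")
```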

While presenting the findings, the authors noted several limitations of the study. They stated, “one limitation of this study is the reliance on data gathered from a single source, the participants themselves. The use of multiple source methods, possibly including peer ratings of EI pre and post- intervention, would engender greater confidence in the findings” (p. 311). Another limitation included the teachers/tutors that taught the intervention. Because the teachers/tutors play an instrumental role, their EI and ESE need to be considered when making teacher selections.

“Previous research has suggested that higher levels of ability EI and ESE are desirable for a number of important reasons associated with work-related outcomes, academic achievement and graduate employability, but until now there have been few studies that demonstrate it is possible to increase levels of EI and ESE through teaching or training” (p. 306). Through this study, we can conclude that it is possible to improve EI ability. The results of this study also show that it is possible to increase a person’s self-efficacy. These findings have significant implications for how we should be teaching and training our elementary, middle school, high school and college students.

 

References

Nelis, D., Quoidbach, J., Mikolajczak, M., & Hansenne, M. (2009). Increasing emotional intelligence: (How) is it possible? Personality and Individual Differences, 47, 36–41. doi:10.1016/j.paid.2009.01.046

 

Menu: Accelerated Learning – Best with Sides

Hodara, M., & Jaggars, S. S. (2014). An Examination of the Impact of Accelerating Community College Students’ Progression Through Developmental Education. The Journal of Higher Education, 85(2), 246–276. doi:10.1353/jhe.2014.0006


Most community colleges are feverishly trying to meet President Obama’s College Completion Challenge of increasing the number of students who complete a degree or another credential by 50% by the year 2020.  The task is large.  Fewer than 30% of community college students graduate within six years.  Even fewer of those who test into below-college-level (aka developmental) courses graduate within six years (College Completion Challenge Fact Sheet).  Whether you are a neo-liberal who wants students to contribute to the economy or an old-fashioned liberal who wants equality for all people, especially those who have been traditionally under-served, you may find this article examining accelerated learning in English and math classes at the City University of New York (CUNY) community colleges a worthy read.

There are several items on the menu of strategies for helping the under-prepared learner progress toward graduation – e.g. accelerated learning, contextualized learning, and problem-based learning.  This article focuses on accelerated learning – an approach in which the developmental sequence of courses an under-prepared student must take is shortened or sometimes offered concurrently with college-level courses.  The authors examined data from the six CUNY community colleges.  Students in the CUNY system are diverse:  “15% of students are Asian, 29% are black, 37% are Latino, and 19% are White; …48% are first-generation college students; and 46% have household incomes under $20,000” (City University of New York, 2011, as cited in Hodara & Jaggars, 2014).  Overseeing all this diversity is a centralized developmental education testing policy with firm cutoff scores that determine placement into developmental education classes.  Each of the six colleges, though, was more or less free to design its own “menu,” or developmental course sequence.

The authors found that the English and math departments did not tend to consult with their sister departments in the district, resulting in varying developmental sequences at each college.  Though that seems to me an administrative oversight, it provided the researchers with a ripe opportunity to compare the length of developmental course sequences across the district through analysis of existing data, without having to design an experiment.  In English, the researchers designated as the treatment group the two colleges that had a single course of either six or seven credits and compared those to the four colleges with two classes in their sequence.  In math, the control group was the five colleges that had three developmental math classes, compared to the one college making up the treatment group, which had only two developmental math courses.  Data were made available to the researchers over a 10-year period, which allowed for some longitudinal tracking of students out of the community colleges and into the CUNY universities.

The methodology is where things get complicated (and, honestly, over my head at this very early point in my doctoral program).  The researchers were concerned that simply comparing outcomes of students in the short (treatment) vs. long (control) sequences would not account for confounding variables.   They noticed gender, race, and financial aid differences right away and also wanted to account for high school performance and students’ academic and professional goals.  By using a couple of logistic regression models, the researchers were able to compare like students to each other, e.g., students with similar high school backgrounds, region of birth, citizenship status, and college major.  This section has much more detail to it, and I encourage readers with the appropriate expertise to explore it further, and those without to trust in the prestige of the Journal of Higher Education and believe the researchers did it right!
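
For readers curious about the general idea, here is a minimal sketch of a logistic regression of a binary outcome on a treatment indicator plus covariates, run on made-up data. The variable names and effect sizes are invented for illustration; this is not a reproduction of the authors’ models.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000

# Made-up student records; the variable names are illustrative, not the study's.
df = pd.DataFrame({
    "accelerated": rng.integers(0, 2, n),  # 1 = shorter (treatment) sequence
    "hs_gpa": rng.normal(2.8, 0.5, n),
    "female": rng.integers(0, 2, n),
    "pell": rng.integers(0, 2, n),         # rough financial aid proxy
})

# Simulated outcome (passing college-level English), loosely tied to the covariates.
logit = -2 + 0.5 * df.accelerated + 0.8 * df.hs_gpa + 0.1 * df.female - 0.2 * df.pell
df["passed_english"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# The coefficient on `accelerated` estimates the treatment effect while holding
# the listed covariates constant.
model = smf.logit("passed_english ~ accelerated + hs_gpa + female + pell", data=df).fit()
print(model.summary())
```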

In general, students in the accelerated courses had better outcomes in their courses and in subsequent college courses than students in the control groups.  The results were not robust, though, and there were some differences between the English and the math sequences.  Students in the accelerated English sequences were more likely to get to college-level English, to accumulate credits, and to graduate.  However, students in the accelerated math sequence, though they passed college-level math, did not demonstrate long-term college success.  Academic policies within the institutions may contribute to that finding.  The authors report that passing college-level English is required for many other courses, allowing those students to continue making progress toward degree completion, while passing college-level math does not necessarily lead to progress in other courses for non-STEM majors.

One aspect of this article’s contribution to the field is its interesting perspective on the role of community colleges in higher education.  The authors suggest that community colleges may actually create barriers to achieving a college education – the opposite of their mission of increasing accessibility to higher education for those who might not have the option to attend a university.  Since more first-generation students and students of color start at community college, this may inadvertently create a class system that stratifies middle- and upper-class white students into the universities and students of color and low-income students into community colleges.

Another contribution is the authors’ acknowledgement in the discussion that the generally modest gains seen for the students in the accelerated classes can likely be improved by “more thoughtfully designed reforms incorporating stronger student supports,” leading to “substantial increases in developmental students’ college-level credit accrual and graduation rates” (Hodara & Jaggars, 2014).  Also, as colleges continue to work on meeting the completion challenge with improved graduation rates, collaborative conversations around developmental education are more likely to happen, building relationships and infusing diverse perspectives to provide a more nutritious meal that includes healthy sides in addition to the main dish of accelerated learning.

The influence this article has on me and my evolving line of inquiry is that, before taking this class, I was considering pursuing an interest in data analysis – partly because I was getting tired of feeling as though I wasn’t making the difference I had hoped to in the classroom.  Though I believe familiarity with data and careful analysis is crucial to effective teaching and effective programs, I find that the analysis is not quite as tasty to me as the prospect of creating a colorful and nutritious “meal” with a variety of sides that complement the main dish.

 

References

Hodara, M., & Jaggars, S. S. (2014). An Examination of the Impact of Accelerating Community College Students’ Progression Through Developmental Education. The Journal of Higher Education, 85(2), 246–276. doi:10.1353/jhe.2014.0006

College Completion Challenge Fact Sheet. American Association of Community Colleges.  Retrieved from http://www.aacc.nche.edu/About/completionchallenge/Documents/Completion_Toolkit_Complete.pdf

 

Cue the Zoom on Baltimore

Weist, M. D., Stiegler, K., Cox, J., Vaughan, C., Lindsey, M., & Ialongo, N. (2010). The Excellence in School Mental Health Initiative: Final Report (pp. 1–41).

 

Last week I looked at School Mental Health procedures in Australia. I was really jazzed to learn that the sort-of unformed ideas floating around in my head for the last year a) were already fleshed out and b) ACTUALLY EXISTED. And they have for over a decade! But I wanted to do more research about how we are addressing School Mental Health a little closer to home. Preliminary research suggested that Maryland is the place to be for this sort of thing, so that’s where I looked. If we were on YouTube, this is where we’d zoom out of Phoenix and pan over to the East Coast. Cue the Zoom on Baltimore:

In this report, completed by the University of Maryland, Center for School Mental Health, the authors reviewed data collected over two and a half years as two specific schools implemented the Excellence in School Mental Health Initiative (ESMHI). According to the report, “The overall goal of the project was to demonstrate the potential for a full continuum of environmental enhancement, stakeholder involvement and evidence-based mental health promotion and intervention integrated into two schools serving students in grades Kindergarten through 8th grade” (Weist et al., 2010).

I realize reading that might invoke a little of what I experienced earlier this week. Are you thinking, “Um, what? I think I know all of those words, but when I read that sentence they just kind of glide right under me …” Here’s a picture that might help from (of course!) our friends over at MindMatters in Australia:

The Excellence in School Mental Health Initiative involved programs and staff to support all levels of the triangle:

  • They developed and ran parent groups to improve parent involvement and relationships with teachers
  • They implemented Paths to PAX, a universal/school-wide prevention program
  • They provided professional development to teachers to improve understanding of mental and behavioral health
  • They provided small group interventions for students struggling with behaviors, also known as “early intervention strategies”
  • And mental health clinicians held individual/group therapy sessions with students.

Many schools around the country have parts of this triangle in place, but it is rather rare to see the entire continuum fully supported. The purpose of the project was to find how well all these pieces fit together when they were all in place, including things like the parent involvement. The authors specifically noted that this study could not be used to make causal conclusions (i.e. saying that if your school does these things, it will get the same results), but could be used to make recommendations for the future or for other schools. The authors used descriptive, qualitative and quantitative data to both create and evaluate the initiative. This included a variety of demographic and school data, such as enrollment, attendance, discipline, etc.; surveys and interviews with students, staff, parents, and community members; treatment data collected by mental health providers; case studies; and focus groups.

The authors found that there were many gains, including better parent-school relationships, students receiving more mental health care, and improved teaching strategies in regards to behavior in the classrooms. They also found that while there are many reasons  to give mental health care within the school setting, there are a lot of things that can get in the way. Things like high teacher turnover rates, teachers being overwhelmed with everything else they are expected to do, and variable support from administration all impact the implementation and effectiveness of such programs. BUT!!! They did find that having significant supports – like great funding (described below), buy-in from higher levels, and University support – made it possible to face challenges head-on and overcome many barriers.

I was simultaneously disappointed and pleased with the results. This initiative had funding from a lot of different places, including the City of Baltimore, the University of Maryland, the Baltimore School District, and several public and private organizations. It seems like it would be a dream! But even with all that they ran into many of the same problems I have experienced in trying to carry out different layers of the triangle above. For example, I’ve been at a few schools that buy new programs to address the widest level of the triangle (Whole School Environment). Everyone is gung-ho for the first few weeks, but implementation falls off after a month or two. I would have thought that with so many resources and support staff promoting this initiative for 2.5 years, there would have been more buy-in and compliance from staff. But in reality, they dealt with the same problems I have seen, and for the same reasons: too many other things to focus on, too overwhelmed, high teacher mobility, lack of administrative support, and the program not meeting perceived expectations. While this was disappointing, it was also refreshing to know that simply implementing a new program or jumping on the next bandwagon of a particular intervention is not going to change the school culture. To really make a lasting impact on the school culture, it needs to happen slowly and over a long time.

I did have more criticisms of this report than I did last week’s. For the most part, the report was well-written and easy to follow. They used language that was easy for someone in the educational field to understand, and they gave so much data to support their conclusions. The difficulty was that there was So. Much. Data! And it was all in paragraph form, which meant it was nearly impossible to really get a good handle on it. They were using data from 2 schools and gave in-depth analysis of each type of data from each school. If it had been presented with visuals, like graphs and charts, it would have been so much easier to grasp. Throughout the report they did reference graphs and charts in the appendices… but there weren’t actually any appendices at the end. And the Appendices link provided in the Table of Contents was no longer active. I think I would have been better able to make connections to my own school if I could have seen the data differently.

As I am collecting these articles and reports, I am building this dream world in my head. I want these things in Phoenix, in Arizona. I want to be a part of building them, of making them actually happen. I want to see students in their classrooms more because they’re getting in trouble less. I want to see students that have a better quality of life because they understand what to do in the classroom or in social situations and they have the skills to do it. I want to see teachers who are less depressed and stressed out. I want to be in classrooms where teachers are able to focus on the things that made them want to be a teacher, not all the extraneous junk that keeps getting piled on their plates. (OK, mental health initiatives probably won’t actually affect that, but hey – it’s my dream world, I can make it look however I want!)

I really do want to see some of these initiatives in play, though, to see what they look like when they’re actually happening. Does it look, feel, and sound like any other school? Are culture changes only noticeable if you’re an insider, privy to all the inner-workings of a school? Or is it tangible? Noticeable to everyone who walks in? Do students and teachers enjoy being there because of the positive atmosphere? Or is it still a school, where kids complain about homework and teachers count down the days to summer break? I don’t have the answer yet, but I am doing what I can to find out!


Facebook as Professional Development?

Rutherford, C. (2010). Facebook as a source of informal teacher professional development. In Education. Retrieved from http://ineducation.ca/index.php/ineducation/article/view/76/512.

 

For professional development (PD) to be considered effective, it must meet four criteria. These criteria characterize PD as: 1) sustained, on-going, and intensive; 2) practical and directly related to local classroom practice and student learning; 3) collaborative and involving the sharing of knowledge; and 4) participant driven and constructivist in nature (Rutherford, 2010, p. 62). In the journal article, Facebook as a Source of Informal Teacher Professional Development, author Camille Rutherford seeks to ascertain whether discussions that happen between teachers and other educational professionals on social media can be considered professional development and whether such informal conversations meet the above four criteria for effective PD.

Rutherford (2010) begins her article by giving historical context for the seven categories that form the knowledge base for teaching; such categorization serves to “simplify the otherwise outrageously complex activity of teaching” (Rutherford, 2010, p. 61). These seven categories are not meant to be taken as a reduction of the teaching profession to a list of criteria, but rather form contextual categories that help synthesize the diverse areas in which professional development can be offered. These seven categories, as first defined by Shulman (1987), are: 1) content knowledge; 2) general pedagogical knowledge; 3) curriculum knowledge; 4) pedagogical content knowledge; 5) knowledge of learners and their characteristics; 6) knowledge of educational contexts [e.g. different styles of education]; and 7) knowledge of educational ends, purposes, and values [e.g. historical perspectives] (Shulman, 1987, p. 7).

In order to determine whether teachers’ conversations on social media met the criteria to be considered effective PD, Rutherford monitored the postings on a Facebook group for teachers in Ontario, Canada. She notes that Facebook has the perception of being an “adolescent playground ripe with juvenile gossip and social bullying”; despite this perception, however, Facebook has become a space where professionals gather to network and exchange ideas and resources (Rutherford, 2010).  In her monitoring of the Ontario Teachers – Resource and Idea Sharing group, which at the time (2010) had more than 8,000 members, she used both qualitative and quantitative examinations of the discussion topics.

Over the course of the 2007-08 school year, she found that 278 new and unique topics of discussion were created, generating 1,867 posts from 384 different Facebook users (Rutherford, 2010). Any post that did not garner more than two responses was excluded from the study since, without another’s input, it cannot be considered a discussion. Posts deemed too sales-y, or geared toward promoting an item, product, or service, were also excluded. Next, two independent “coders” went through the remaining posts and categorized them into one (or more) of the seven categories of teacher knowledge (see above), eliminating along the way any additional posts that were promoting an item, product, or service for a fee (Rutherford, 2010).
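To make that workflow concrete, here is a tiny illustrative sketch, not Rutherford’s actual data, code, or coding scheme, of the filter-then-code-then-tally steps described above; the thread tuples and category labels are entirely made up.

```python
# Illustrative only: filter out non-discussions and promotional threads,
# then tally the remaining coded threads by knowledge category.
from collections import Counter

# Hypothetical coded threads: (reply count, is_promotional, category assigned by coders)
threads = [
    (5, False, "pedagogical content knowledge"),
    (1, False, "curriculum knowledge"),        # excluded: not more than 2 responses
    (8, True,  "employment"),                  # excluded: promotional
    (4, False, "employment"),
    (6, False, "curriculum knowledge"),
    (3, False, "pedagogical content knowledge"),
]

# Keep only genuine discussions: more than two responses and not promotional.
kept = [category for replies, promo, category in threads if replies > 2 and not promo]

counts = Counter(kept)
total = len(kept)
for category, count in counts.most_common():
    print(f"{category}: {count / total:.1%} of analyzed threads")
```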

The study found that the majority of the posts related to Pedagogical Content Knowledge (strategies, tips, and tricks to help out in the classroom), representing just over a quarter of all posts (Rutherford, 2010). The next-largest category was a surprising one: it didn’t fit into any of the categories in Shulman’s conceptual framework for teacher knowledge, so Rutherford created a new category, Employment (opportunities and/or related questions). Posts in this area made up 22.5% of all posts analyzed (Rutherford, 2010). The only other category representing more than 10% of total posts was Curriculum Knowledge, at 19.8%; each of the remaining categories accounted for less than 10% of total posts (Rutherford, 2010).

One of the essential features of effective professional development is that it be collaborative, on-going, practical, and participant driven. Rutherford (2010) found that the average number of months that users were actively engaged in discussion was less than 2 months (1.79 months) and the average user made only 4.2 posts during that span. These data suggest that discussions happening on Facebook, while certainly constructivist, collaborative, and participant-driven in nature, were lacking the essential “on-going” feature necessary for effective professional development.

In my situational context, as a professional development provider to schools across the state, we’ve tried to integrate more online components into our professional development offerings, only to find that teachers generally have not utilized them to the extent we were hoping. I see this evidenced in my own practice as well. When I reflect on my own professional development, both as a teacher and in my current role as a trainer, I’ve been asked to “continue the conversation” on Edmodo, a social media site similar in platform to Facebook but dedicated to educators and education. I found the steps of creating a username and password, confirming my email, setting up a profile, requesting access to the page, and waiting to be granted access to be cumbersome hurdles that did little to streamline the continuation of learning. While writing this blog post, I went back to those pages, only to find that there had been only one post in the eight months the group had been around.

In my own learning experiences, like my Master’s degree, for example, I found the process of online modules, classes, and activities to be an ineffective medium for facilitating true learning, as the “flow” of a conversation was very unnatural and not conducive to insightful reflections and discussions on practice and pedagogy. While I’m sure that some people enjoy and find value in the convenience of the online format for meeting their varying schedules and time constraints, there is something incredibly valuable for me about having that in-person, face-to-face interaction when learning from and with other people. It becomes much easier, in person, to hear the other person’s tone, read their body language, and ask follow-up questions in a meaningful and timely manner, things that are lost through virtual communication. Because of these sentiments, I generally agree with Rutherford’s assessment when she says, “Facebook teacher groups and similar forms of social media should be seen as an effective supplement [emphasis added] to traditional teacher professional development” (Rutherford, 2010, p. 69). The idea that online modules could ever replace in-person professional development is not one I could support, but they certainly have a role to play as a free, low-risk, and convenient medium for teachers to collaborate and learn from one another.

 

Additional works cited:

Shulman, L. S. (1987). Knowledge and teaching: foundations of the new reform. Harvard Educational Review, 57(1), 1-22

Music and Technology

Carruthers, G. (2009). Engaging music and media: Technology as a universal language. Research & Issues in Music Education, 7(1), 1–9. Retrieved from http://www.stthomas.edu/rimeonline/vol7/carruthers.htm

 

This week I read “Engaging Music and Media: Technology as a Universal Language” (Carruthers, 2009). The article is about the roles of music and technology in education and how the two might work together. The article doesn’t offer new research, but it does synthesize others’ research.

The first discussion is about the roles of music within education and how they might affect each other. Carruthers states that music often plays a secondary role in education, meaning that we don’t teach music as part of our curriculum because music is good in and of itself; we have music within our curriculum because it supports something else. As a music teacher, I often find myself saying “This directly supports you” to other content teachers. You don’t often hear a math teacher justifying why the kids need to learn math. There is an array of reasons why music is valuable in its own right. It doesn’t need to be supporting anything else.

After reading the article, I recognized that I had used the same type of reasoning as the supporters of Flores v. Arizona, as discussed in “Keeping up the Good Fight: The Said and Unsaid in Flores v. Arizona.” The supporters had many reasons why the ELL funding in Arizona should be awarded to the schools. The findings, however, showed that the reasons from the supporting side fell under a ‘you should support this because you’ll get this out of it’ mentality (Thomas, Risri Aletheiani, Carlson, & Ewbank, 2014). With that being said, great teachers integrate all areas into their content. Students need to see how everything is interrelated. Oftentimes children are taught in compartments: math in math class, science in science class, etc., but our lives do not work this way.

Music has what Carruthers calls a division of labor. In music, this is the composer, performer, and listener; each has a separate job, and people rarely cross over. With the addition of technology, this isn’t necessarily the case. My own children compose music with special applications that do not require them to read music. Anyone with the right software can do all three. I see this as one of the biggest impacts technology has had on music. In the past, if one didn’t read music, composing something to share with others was rather difficult. Now, with software and media sharing, this becomes relatively easy.

In order to look at the various ways technology impacts us, Carruthers defines technology as anything “from the wheel” to “a personal computer.” This immediately caught me off guard. Defining what counts as technology had never occurred to me. I simply thought of technology as laptops, computers, and electronic devices, and any software to go along with them, but after reading how Carruthers approaches technology, I may have to be more specific about what I’m viewing as technology within my research. The ways technology can have an impact, according to Carruthers, can be broken into four parts: technology that 1. makes things easier to do than before, 2. does things better than before, 3. allows us to do things we couldn’t do before, and 4. makes us think differently. Again, I had to consider the future of my research. At what level of impact am I going to be assessing? For instance, making things easier to do than before, such as multiplication practice, may not have as big an impact on student achievement as something that makes the student think in a different way.

The article was more thought-provoking than I expected it to be.  Carruthers was clear from the beginning that he was reviewing previous research and that the paper would not answer many of the questions it raises.  The purpose of the paper is to create discussion, and it proved to do just that. It caused me to look at the research I’m heading into and the basics of how I will approach it. I am dealing with so many more layers than I had previously thought. Carruthers poses, “It is incumbent upon us as educators not only to evaluate the uses of technology – to extol its virtues and denounce its failings – but also to explore deeply how it encourages or causes us to think differently about the world around us.” In my research, I will have to decide whether I am going to look at the level of technology that creates the deepest learning or not take it into consideration at all.  Do I continue looking at the impact of music with technology on achievement, or solely at the impact of technology? If I research the impact of music and technology together, does the depth of learning within the music matter in the research? For instance, composing requires a deeper depth of knowledge than identifying notes. How does one take this into consideration?  If my research does show an impact on student achievement, is it necessary or valuable to determine whether the act of utilizing technology is creating more engagement or whether the technology is deepening the students’ understanding? Either one could impact student achievement; is there a way to tell which it is? How do I approach the research in a manner that will include my community and their views? In fact, can I even account for the effects technology, and especially music, has on the community?

Carruthers said it well: “Many of the benefits of music study, some of which are imbedded in the art form itself, are intended by teachers and curriculum planners while others are not.” I suspect that this is the case with technology as well. Unfortunately, it adds another question for me: how do I consider this in my research?

Overall, the article was well written and professional. It was organized in a logical way, and Carruthers was very clear that he was presenting theories and that, as a literature review, the piece would create more questions than could be answered in one article. His ideas are insightful and have definitely given me pause. I have a lot to consider as I dive deeper into my research.

 

 

Thomas, M. H., Risri Aletheiani, D., Carlson, D. L., & Ewbank, A. D. (2014). “Keeping up the good fight”: The said and unsaid in Flores v. Arizona. Policy Futures in Education, 12(2), 242–261. doi:10.2304/pfie.2014.12.2.242

Education Starts in the Community

Grothaus, T., & Cole, R. (2010). Meeting the challenges together: School counselors collaborating with students and families with low income. Old Dominion University.

In the article, Meeting the Challenges Together: School Counselors Collaborating with Students and Families with Low Income, Tim Grothaus and Rebekah Cole examine how school counselors work with members of the community to raise student achievement and to give students more opportunity and better access to resources. In addition, school counselors are exploring ways of informing community members. Grothaus and Cole give the purpose of their study right up front: “culturally responsive school counselor advocacy and collaboration with low income students and their families is essential to successfully address the pernicious achievement and access gaps pervasive in U.S. schools” (Grothaus & Cole, 2010, p. 3). This article was very well organized.  Ideas were clustered and headings were used throughout, making the argument easy to follow.   I chose this article for its insights that parallel my topic of inquiry, “school counselor roles in challenging biases, educating stakeholders, and engaging in advocacy for these students and families” (Grothaus & Cole, 2010, p. 12). The theoretical framework for the research is set around the claim that “the prevalence of youth from families with low income and the distressing inequities in the educational data associated with family income level merits attention” (Grothaus & Cole, 2010, p. 3).

I found the research vague about how data were collected.  Much of the data used was referenced through other sources, and it seemed that relevant data that would, in my view, support the analysis was not included. For example, how many of the youth are first-generation immigrants, how new are they to the school, and how often do they move?  Answers to these questions would help transition the reader to broad findings such as, “Data indicate that low income students have not been afforded equitable educational experiences” (Grothaus & Cole, 2010, p. 3).   Does this data speak for all low-income students, and what if one compares data on equitable educational experiences by geographic location?  For example, a student who lives near Washington, D.C. will have better opportunities and exposure for learning about government than a student from Texas or Arizona.  Likewise, students in Arizona and Texas will have more opportunities around topics such as SB 1070 and border security.  Are those geographic experiences counted as equitable educational experiences?  It seems difficult to accurately support the claim that “students from families with low income often lack the resources and teacher expertise of more affluent schools” (Grothaus & Cole, 2010, p. 3).  Is teacher expertise in these low-income areas measured by student achievement?  Teacher evaluation is a point of controversy that is still being debated.  In the interest of duplicating the findings, I would like to see the article define terms like educational experiences and teacher expertise. One example could be for researchers to have teachers from an affluent locale and a low-income locale trade places for a period of four to five years, then return with data that shows the disparity in teacher expertise.
The research showing that “low-income families tend to be less involved in their children’s academic lives than middle-class families” (Grothaus & Cole, 2010, p. 4) is more convincing as an explanation for the differences in student achievement. “Students who are eligible for free lunches are about two years of learning behind the average student of the same age from non-eligible families” (Grothaus & Cole, 2010, p. 4).  What troubles me is the focus on what I see as supporting details rather than the core of a bigger problem.  I can’t see how to separate the school from the problems of the rest of the community. The authors offer all of these facts, which basically amount to a list of citations strung together, and then turn the corner with an example of a school that is beating the odds. Indeed, many schools are “proving that race and poverty are not destiny” (Grothaus & Cole, 2010, p. 6), and “one seemingly robust factor in many of the success stories is school and family collaboration” (Grothaus & Cole, 2010, p. 6). This is the data that is of the most interest to my own topic of inquiry: “Research over the last few decades confirms that family involvement in their children’s education enhances the potential for students’ success- specifically with high achievement, increased rates of attendance, fewer disciplinary referrals, better homework completion, more course credits accumulated, and increased likelihood of high school graduation and college attendance” (Grothaus & Cole, 2010, p. 6).

The larger community is a combination of business, government, civic services like police, fire, power, water, and sewer, and schools.  All of these elements of the larger community move in and are created to serve the people of the community.  Conversely, the larger community depends on the people of the community to buy and spend their money for its support.  It’s simple economics.  Low income is linked to people with little education working in low-paying jobs, which translates to longer hours and results in the larger community receiving less money. The problem in low-income communities is that there is less money; the solution is to increase the earning potential of the people.  The implied and obvious way to quickly increase earning potential is through education.  Therefore, it benefits the community and the people of the community if they invest in ways of supporting education.  “A number of schools and their boards are arriving at the same conclusion- that collaboration is an avenue through which students’ needs may be met and achievement promoted” (Hands, 2005, p. 64). The question that requires further study is how the community and the people in it can support education.  A good starting point is “identifying goals, defining the focus of the partnerships, and selecting potential community partners” (Hands, 2005, p. 67). Grothaus and Cole point out that school and family collaboration also needs to “examine school personnel biases about families with low income and challenging colleagues to change their views and practices” (2010, p. 7). I don’t think focusing on teachers’ biases effectively contributes to establishing collaboration.
“Unnecessarily alienating school personnel through strident advocacy may be less effective than respectfully but firmly challenging biases and building coalitions for change based on shared principles” (Grothaus & Cole, 2010, p. 8). “School-family partnerships benefit schools and families in a variety of ways, including families’ feelings of acceptance into the school community” (Grothaus & Cole, 2010, p. 10). Collaboration builds channels of communication to ensure the school and the community are “empowered and equipped with the resources they need to support their children” (Grothaus & Cole, 2010, p. 10). As Grothaus and Cole conclude, “School counselors can advocate for these partnerships via challenging bias, training school personnel, engaging in outreach to families, conducting research to ascertain effective practices, and promoting the benefits involved in collaborative problem solving and accessing student and family strengths” (Grothaus & Cole, 2010, p. 13).  The solution should start with the community and partner with the schools in the process.

References:

Hands, C. (2005). It’s who you know and what you know: Process of creating partnerships between schools and communities, 63–84.

Barriers to Introducing System Dynamics in K-12 STEM Curriculum

Skaza, H., Crippen, K. J., & Carroll, K. R. (2013). Teachers’ barriers to introducing system dynamics in K-12 STEM curriculum. System Dynamics Review, 29(3), 157–169.

Science, technology, engineering, and math (STEM) education is required in order to prepare students for fast-paced 21st-century careers, but best STEM teaching practices have yet to be fully developed. One technique currently being studied is system dynamics modeling, which “provides a valuable means for helping students think about complex problems” (Skaza, Crippen, & Carroll, 2013, p. 158). System dynamics offers a means of thinking and modeling that allows students to begin making connections between variables. If system dynamics modeling gives students greater access to STEM curriculum, I believe we need to discover the barriers to program implementation and actively begin breaking them down.

Skaza, Crippen, and Carroll analyzed current barriers to introducing system dynamics into K-12 STEM curriculum in their 2013 article. The authors address three research questions by means of a mixed-method approach. The questions are as follows:

  1. How are teachers currently using system dynamics simulations and stock and flow models that were already a part of their adopted curriculum?
  2. For teachers who are not using the simulations, what barriers persist to their classroom implementation?
  3. What is the level of teachers’ understanding of the system dynamics stock and flow modeling language and how might that be influencing the classroom use of system dynamics tools? (p. 158)

The organization of the article is clear, allowing the reader to easily progress through the study of system dynamics. Structurally, the article begins with an introduction, which includes the main research questions addressed in the remaining sections. After the introduction there is a review of related literature, allowing the reader to get a better view of previous findings by other scholars. The literature review contains relevant topics that allow for a broader examination of the research topic.  Next, the authors thoroughly cover the context for the investigation, the methods used, the results, the discussion, and final remarks on future research. As a whole, the organization of the article is comprehensive and coherent.

Skaza et al. (2013) addressed a concept that has previously been studied by other educational researchers. According to the authors, a “larger base of empirical research is needed” (p. 159) in regards to system dynamics in order to begin fully utilizing them in most K-12 classrooms. Overall, the study found that only 2.8% of the educators completed the curriculum, which is equivalent to two participants. After this discovery, the researchers analyzed the major barriers such as lack of access to the technology, low teacher efficacy, and not enough professional development support. Outcomes for the study will allow for future research to address the major barriers discussed.

Within the article, Skaza et al. (2013) analyze systems thinking and system dynamics modeling as means of giving improved access to STEM curriculum, particularly to minority students. Systems thinking and system dynamics modeling are “consistent with recent calls for educational reform that focuses on active learning strategies, teaching for transfer to new problems, as well as intending for creativity and innovation as key outcomes” (Skaza et al., 2013, p. 157). Thus, this study is relevant to the United States’ broader push toward effective STEM education.

In regards to theoretical frameworks, “the theoretical framework for this revision includes system and system models as crosscutting concepts and as a component of Scientific and Engineering Practice” (Skaza et al., 2013, p. 157).  As a whole the authors stay true to the framework making the article cohesive and appropriate.

Within the methods section, the authors discuss the mixed-method approach to data collection that is used in the quest to answer the three research questions. The “research method involved a single-group, mixed-method (quantitative-qualitative) design consisting of two phases: a survey followed by a focus group” (Skaza et al., 2013, p. 160). Participants for this study were selected from 40 high schools and consisted of 160 teachers, while the focus group was made up of four participants. In summary, the survey consisted of 17 questions containing both qualitative and quantitative measures. Also, the focus group contributed valuable support for the survey findings, which could be made stronger by increasing the number of focus group participants.

The researchers analyzed the surveys by looking at both qualitative and quantitative data, while using the focus group information to add depth to the survey findings. If another researcher wanted to replicate the analysis piece of this research, there is adequate information to do so. The analysis section fully describes the steps taken by the researcher and allows for replication due to the specifics of how data was analyzed in both the surveys and focus group. Overall, the researchers determined the number of participants who actually implemented the system dynamics concept into their classroom and if teachers failed to implement, the researchers worked to uncover the barriers to implementation.

As far as the findings are concerned, they are based on a thorough understanding of the data. By this I mean that the researchers analyzed the survey information, gained knowledge, and then used the focus group to either confirm or disconfirm these findings. Also, there were multiple questions within each category on the survey, helping to gain more accurate information. For example, the survey asked teachers to provide proof of understanding the concepts by means of essay answers. So, if a teacher said that unavailable technology was their barrier yet they were unable to describe a science concept, the researchers could conclude that teacher efficacy was also an issue. The researchers found that the major barrier teachers cited to using system modeling in the classroom was technology, yet the focus group and survey essay answers told a different story of potential teacher-efficacy problems. Thus, I believe that the barriers are accurately captured, which can in turn lead to potential new research or action.

As an educator, I have experienced the push towards technology use in the classrooms. I believe that this thrust is necessary and important towards the growth of our students and the necessity to bring students into the 21st century. Our goal is to help students use technology to problem solve and work towards higher understandings but what happens when teachers don’t fully understand how to integrate technology into the classroom? Many educators that I have encountered feel uneasy about technology, thus do not make an effort to use it to enhance the learning environment. With this being said, our first move towards incorporating system dynamics modeling into the classroom, in order to enhance STEM understandings, is ensuring that all of our educators and future educators are technologically competent.

 

 

References

Skaza, H., Crippen, K. J., & Carroll, K. R. (2013). Teachers’ barriers to introducing system dynamics in K-12 STEM curriculum. System Dynamics Review, 29(3), 157–169.

Promoting success in online education… but, what is success?

Harrell II, I. L. (2008). Increasing the success of online students. Inquiry, 13(1), 36–44. Retrieved from http://www.vccaedu.org/inquiry/inquiry-spring-2008/1-13-Harrell.html

A concise, if relatively simplistic, piece, “Increasing the success of online students” highlights three components that impact student retention in online or distance education programs (Harrell, 2008).  These are student readiness, orientation, and support.  Harrell notes that online or distance education research also demonstrates the importance of “instructor preparation and support” and “course structure” for online student success, but the author sets these aside for this discussion.  In part because online education programs suffer from very high attrition rates, the author focuses on retention as the primary indicator of online student success.

 

Whereas other studies on online learner success, particularly prior to the extensive penetration of the internet in the distance education domain (Roblyer, Davis, Mills, Marshall, & Pape, 2008), focus on either learner characteristics or the learning environment, Harrell does not make this distinction.  Corroborating this approach, through an extensive research effort culminating in a readiness instrument for [prospective] online learners (the Educational Success Prediction Instrument [V2]), Roblyer, Davis, Mills, Marshall, and Pape (2008) state that their “findings indicate that a combination of student factors and learning conditions can predict success” of online learners, “though predicting success is much easier than predicting failure” (p. 99).  The orientation of the piece is higher education – the author is an assistant professor and the coordinator for student affairs at J. Sargeant Reynolds Community College, presumably writing from his own context; however, the references used and the message are more broadly applicable. While Harrell’s piece is not revelatory, it reinforces certain best practices, espoused by related studies, that are relevant for online learning program development.

 

“Positive impact on online student success”

When an individual embarks on anything new, preparation for the new environment, expectations, relationships, and skills required is integral to his or her capacity to endure what’s ahead positively and productively.  Harrell recommends assessing student readiness for online learning before a student begins coursework, then using this information either to counsel students against the online option or to build an individualized support strategy for each student, based upon their apparent strengths and weaknesses. An orientation should follow, possibly in the form of an entire course (as exemplified by Beyrer (2010) and the Online Student Success online education orientation course).  The author favors online (vs. face-to-face) orientations, to get students navigating the technologies and program expectations immediately, in the realm and in ways that “mimic” their educational program, before coursework becomes distracted by the student’s [inevitable] technical struggles.  Student technical support that is as accessible and available as the “anytime, anywhere” coursework is absolutely necessary.  The useful suggestion is made to leverage the skills of student workers and others within and beyond the school community to optimize support in this way (without requiring financial and human resources to which many schools lack access).

 

Enabling students to feel and cultivate their own sense of community and belonging is critically important – to students’ individual achievement and to the success of the program. The author cites studies that have recorded students’ reasons for withdrawal as very often being a sense of isolation, or not feeling a part of something (bigger than themselves).  A community among online students is relevant for facilitating a peer culture with mutual engagement, contributing to the student’s school support system, and creating opportunities for interdisciplinary collaboration and shared “real world” experiences.  Tools for communicating regularly and without pretense (e.g. instant messaging and social networking) and online spaces (e.g. “virtual lounges”) where students can connect on academic or personal topics can support the development of such communities.  “The more students integrate into the formal and informal social and academic culture of the institution, the more successful they will be” (Harrell, 2008).  In addition to these important features of an online program that supports student success, Roblyer et al. (2008) emphasize that “initial active involvement in online courses predicts success. That is, students who are active in the first few weeks of the class are more likely to be successful in the course; dropout behavior is most likely to occur in the early weeks of the course” (p. 106).
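As a thought experiment, and not anything proposed by Harrell or Roblyer et al., a program could operationalize that “active in the first few weeks” finding as a simple early-warning check that routes low-activity students to outreach. The field names and thresholds below are entirely hypothetical.

```python
# Hypothetical sketch: flag students with low early engagement for outreach.
from dataclasses import dataclass

@dataclass
class Student:
    name: str
    logins_first_two_weeks: int
    posts_first_two_weeks: int

def needs_outreach(s: Student, min_logins: int = 4, min_posts: int = 2) -> bool:
    """Return True when early activity falls below illustrative thresholds."""
    return s.logins_first_two_weeks < min_logins or s.posts_first_two_weeks < min_posts

roster = [
    Student("A", logins_first_two_weeks=9, posts_first_two_weeks=5),
    Student("B", logins_first_two_weeks=1, posts_first_two_weeks=0),
]
for s in roster:
    if needs_outreach(s):
        print(f"Reach out to student {s.name} before week three.")
```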

 

The development of a “sense of community” is different from developing a community of practice.  “Communities of practice [as defined by Etienne Wenger-Trayner] are groups of people who share a concern or a passion for something they do and learn how to do it better as they interact regularly” (http://wenger-trayner.com/theory/). Perhaps the more inclusive (for both the participants and the institution) and ultimately impactful approach is to develop a community of practice among online learners.

 

Peers – in multiage groups spanning grade levels – might organize an action research agenda around a theme or specific research question, as one example of constructing empowered communities of practice among online student populations.  They could do this on a semester, annual, or episodic basis, but continually throughout their postsecondary career.  Each student would have a position in the community, defined in part by their experience and budding expertise (or competences, as Wenger [2000] discusses this).  Through the shared research agenda, with each individual engaged in and accountable for some aspect of the process, and through coordinated action steps to maintain the group’s “alignment” to the co-constructed vision and mission, the students would gain invaluable experience navigating the worlds in their research purview, collaborating with each other, and working toward a common purpose (Bautista, Bertrand, Morrell, Scorza, & Matthews, 2013; Wenger, 2000).  The community of practice would serve students’ development in ways applicable to and that transcend academia – arguably supporting their “success”.  Moreover, the likelihood of their retention would be significantly improved.

 

“Success”=Retention?

Harrell uses “success” and “retention” nearly interchangeably.  Is student success no more than an enrollment number?  Many days, given considerable budget constraints and the overly convoluted ADM calculation process (average daily membership [ADM], the figure used to calculate the per-pupil compensation charter schools receive) for online schools in the state of Arizona, retention feels so crucial to institutional “success” (read: viability and sustainability) that it doesn’t seem a stretch to conceptualize student success in the stark terms of attendance vs. withdrawal.  However, the effort and heart involved in establishing a new school is likely not just for the warm bodies and smiling faces (hidden behind various screens).  The purpose is more plausibly to provide a better, alternative, or altogether unique educational opportunity to some subset of students.  Defining success in this narrow way unquestionably narrows the exploratory purview: if the investigator is interested only in conditions and learner characteristics that lend themselves to a student’s staying in or leaving a school, will the data capture include relevant life circumstances (e.g. having a baby, needing to care for an ailing family member, having to prioritize income generation, or the onset of a mental disability)?  In other words, will this highly limited conceptualization of success skew the perspective on online educational program quality?

 

On a personal note, I had a meeting this week with a student who “dropped out” of our brick-and-mortar school in her eleventh grade year, due to a sudden emergence of debilitating expressions of a mental condition.  This would be a “failure” – on the student’s part and on our part – with respect to Harrell’s use of “success”.  However, she returned.  Several months later, she feels, once again, capable of course work.  Success!  (For now.)  A more comprehensive investigation would seek an understanding of: what kept the family connected to our school; why they felt they could trust us during her leave and now upon her return to care for her appropriately; and, what sorts of support they have received from us that kept their family loyal.

Roblyer et al. (2008) suggest that “virtual schools … must come to gauge their success not only in terms of numbers of students served and courses offered but also in terms of how much they provide access and support to students most in need of an educational edge” (p. 107).  The intent of this post is not to interrogate the author’s use of “success,” but perhaps that inquiry will emerge in the future. What is most interesting about this examination is what it signifies for program development: the benchmarks for programmatic evaluation and metrics of success are, by necessity, predicated upon the institutional imagining of Success – at the student level and at the organizational level.  When we speak of “excellence” in our contexts and consider an action research program to improve upon some aspect of it or, more generally, to strive toward excellence, it is unlikely that retention emerges as the lead indicator.

 

Bautista, M. A., Bertrand, M., Morrell, E., Scorza, D., & Matthews, C. (2013). Participatory action research and city youth: Methodological insights from the Council of Youth Research. Teachers College Record. Retrieved May 30, 2014, from http://www.tcrecord.org.ezproxy1.lib.asu.edu/library/content.asp?contentid=17142

Beyrer, G. M. D. (2010). Online student success: Making a difference. MERLOT Journal of Online Learning and Teaching, 6(1). Retrieved from http://jolt.merlot.org/vol6no1/beyrer_0310.htm

Roblyer, M. D., Davis, L., Mills, S. C., Marshall, J., & Pape, L. (2008). Toward practical procedures for predicting and promoting success in virtual school students. American Journal of Distance Education, 22(2), 90–109. doi:10.1080/08923640802039040

Wenger, E. (2000). Communities of practice and social learning systems. Organization, 7(2), 225–246. doi:10.1177/135050840072002

Wenger-Trayner, E. (n.d.). Communities of practice: a brief introduction. Retrieved June 05, 2014, from http://wenger-trayner.com/theory/

PAR by Proxy: Participatory Action Research, Emphasis on the “Research” (Not So Much on the “Participatory” or the “Action”)

Ostmeyer, K., & Scarpa, A. (2012). Examining school-based social skills program needs and barriers for students with high-functioning autism spectrum disorders using participatory action research. Psychology in the Schools, 49(10), 932–941. doi:10.1002/pits.21646

Ever since learning about participatory action research (PAR), and particularly the work of the Council of Youth Research at UCLA (check out their work here), I’ve been obsessed with the idea. The term itself–participatory action research–doesn’t sound quite as exciting, as novel, as potentially revolutionary, as it is. For the uninitiated, the goal of PAR is to “develop interventions with the direct input of stakeholders” (Ostmeyer and Scarpa, 2012, p. 932). Although PAR-based studies often are designed according to traditional Western evidence-based, if-it-can’t-be-measured-it-doesn’t-exist values, the goals of PAR can intersect with the goals of indigenous research methodologies, in that the PAR studies “take into account the ideas and perceptions of the population directly affected by the problem” (Ostmeyer and Scarpa, 2012, p. 932), just as advocates for indigenous research methods embrace a form of research where “what is acceptable and not acceptable research is determined and defined from within the community” (Denzin and Lincoln, 2008, p. 6) and “they [indigenous people], not Western scholars, have first access to research findings and control over distribution of knowledge” (Denzin and Lincoln, 2008, p. 6). The Council of Youth Research, for example, empowers “youth of color attending city high schools [to] become lead agents in the process of research” (Bautista, Bertrand, Morrell, Scorza, & Matthews, 2013, p. 2). In this model, the people with the most at stake in these critical questions are actively involved in the process of research–and they are the immediate beneficiaries of its findings.

Man. This is exciting stuff. The thrill of it can’t be stated more succinctly than it was by Dr. Melanie Bertrand, who worked with the Council of Youth Research at UCLA and has written and published about its work, upon her visit to our class last night: “The most amazing thing about [it] is that the students who are most marginalized … have the most wisdom to share” (M. Bertrand, personal communication, June 5, 2014).

If I could include sound effects in this blog, right about now I’d embed that old record-scratch sound effect. Wait. Stop. This is exciting stuff, but think about the world I live and work in. The students I teach at my exclusive, prestigious, expensive suburban private school are not the underserved, marginalized urban youth involved with the Council of Youth Research. What place could PAR or, thinking more broadly, indigenous research methodologies have in my world? But then I come back to Dr. Bertrand’s comment: “the students who are most marginalized have the most wisdom to share.” “Most” is relative. No community is without its center and its fringe, its powerful and its less-so.

Although not an indigenous people by the textbook geographic, ethnic, or historical definition, disabled people have been turned “into an essentialized ‘other’ who is spoken for” (Bautista et al., 2013, p. 5). Therefore, indigenous research methodologies may be a viable way to locate disabled people themselves “at the location where research is conducted and discussions are held [to] serve as a major link between fully understanding the historical vestiges of discrimination and the present day manifestation of that discrimination” (Parker cited in Dunbar, 2008, p. 98). It was not so long ago, after all, that Josef Mengele performed his sadistic “experiments” on people with dwarfism (just one example of a heartbreaking many) or, right here in the United States, “researchers injected cancer cells into 19 old and debilitated patients” (Stobbe, 2011) at a New York hospital, without the informed consent of the patients themselves, to see if their bodies would reject them. And if anyone wants to argue that, in addition to being a post-racial America, we’ve moved past discriminating against people with chronic illness or disability, I’ll point out that “less than one-half of individuals aged 21 to 64 with a disability” (Brault, 2012, p. 10) are employed, and those who are working earn less than those without disabilities (Brault, 2012, p. 12). As many people have pointed out before, the depictions of disabled and chronically ill people in the media continue to come from a narrow menu of reductive options: the scary, disfigured villain; the noble, sexless saint; the angry, vengeful victim. But that’s a blog post for another day. What about participatory action research, this private school of mine, and students with disabilities?

Cue the record scratch again: There isn’t a huge population of (physically) disabled students at my school. That’s always struck me as weird. After all, according to the U.S. Census Bureau, 8.4 percent of Americans under the age of 15 have a disability, and 10.2 percent of Americans between the ages of 15 and 24 do (Brault, 2012, p. 5). And yet, a casual conversation with a couple of colleagues who have worked at my school for about 75 years between the two of them yielded only a handful of names of kids with disabilities. As far as I know, in the six years I’ve worked at my school of 700-ish students, there has never been a student who used a wheelchair or other assistive mobility device. There have been, and are, a few students with hearing impairments, vision impairments, and chronic illness (asthma, juvenile arthritis, seizure disorders) but not at all in proportion with the census data. That, too, is a blog post for another day: Is this disparity particular to our school or is it perhaps seen in other independent schools? And why?

What we do have is a big–and, I would say, growing–population of students with attention deficit disorder, attention deficit hyperactivity disorder, autism spectrum disorders, dyslexia, learning disabilities, or other disabilities associated with executive functioning. I’m going on observation, not school data, here, but I would guess that, in this regard, our school is at least representative of the census data, which, as of 2010, reported that “2.3 million children had difficulty doing regular schoolwork (6.2 percent)” and that “about 692,000 had a learning disability, 1.9 million had Attention Deficit Hyperactivity Disorder (ADHD), and 1.7 million had an intellectual or developmental disability or condition” (Brault, 2012, p. 13). These students struggle in school. I might even argue that their struggle is made all the more difficult by the competitive, high-pressure, college-focused, and achievement-driven culture of a school like ours.

There’s Dr. Bertrand again: “The students who are most marginalized have the most wisdom to share.”

Our school is in the midst of a great deal of change. Three years ago, we moved to a block schedule. This year, we’re rolling out the use of Canvas, a learning management system that will digitize teacher-student communication, assignment submission, collaboration, and assessment and grading. Next year, we will shift all instructional materials and texts to electronic tablets. We are in the midst of a comprehensive curriculum review and redesign, including exploring a new capstone project/experience for our graduating seniors. I have served or currently serve on the committees at the center of these activities, and I can say without reservation that we have been thorough, sincere, and utterly student-focused in these efforts. My colleagues have asked, at every turn, “How will this benefit the students? What do they need?” even when the proposed changes threatened discomfiting change for the teachers themselves. I am proud of my school, my colleagues, and myself. And yet: are we perhaps missing an opportunity to involve the students themselves? Sure, student representatives have visited those committee meetings; the administration is dutiful and sincere in connecting with students and soliciting their input. A student served alongside me, the headmaster, parent representatives, and members of the board of trustees on the recently completed strategic plan committee. We ask the students questions, and we listen to their responses. But we haven’t (yet) created a space where the students themselves design the questions, protocols, and experiences of research for themselves, targeting what’s most meaningful to them. And we certainly haven’t specifically sought out the students with intellectual and learning disabilities–arguably among the most marginalized students in our school–to share their wisdom. Maybe we should.

So that’s my 1,300-word preamble to the article I cited at the top of this entry. I am so electrified by PAR that I decided to explore the ways that PAR projects have been used with students with disabilities. This article reflects only one such project. In this study, the authors used PAR to examine the degree to which an elementary school was succeeding in imparting social skills to its students with high-functioning autism spectrum disorder (HFASD). The study is predicated on the idea that “one of the central roles of schools is to prepare students for life in the work force or postsecondary education and help produce competent adults” but that “many skills beyond academics are needed to succeed in college and/or the work force, including adaptive social skills” (Ostmeyer and Scarpa, 2012, p. 933). These social skills include “listening to others, following steps, following rules, ignoring distractions, taking turns, asking for help, getting along with others, staying calm, taking responsibility for one’s own behavior, and doing nice things for others” (Ostmeyer and Scarpa, 2012, p. 932). These are crucial skills for making one’s way in the world, and they are central to my own pedagogy. However, children with high-functioning autism spectrum disorder struggle with these very skills, and with disastrous results:  “Although children with HFASD score in the average or above-average range on intelligence measures, 70% to 90% of these children underperform academically in at least one domain, including math, reading, and spelling” (Ostmeyer and Scarpa, 2012, p. 933), suggesting that “social skills play an important role in … academic performance and that social enhancement may positively impact academic skills” (Ostmeyer and Scarpa, 2012, p. 933). The stakes are higher than that for these students: deficits in social skills can lead to low self-esteem, depression, and anxiety (Ostmeyer and Scarpa, 2012, p. 933). Children who haven’t developed these skills “are also more likely to be rejected, teased, and bullied by peers” (Ostmeyer and Scarpa, 2012, p. 933). Which, as anyone can tell you, leads to anxiety and depression, which don’t incline a kid to get out there and get cracking on buffing up those social skills. It’s a particularly vicious cycle.

The authors engaged in a process of PAR to “gather information on the need for social skills interventions in schools, potential benefits, and barriers to school-based implementation.” True to the values of PAR, the researchers involved stakeholders–in this case, not students themselves but 14 school staff members (“the school principal, a school psychologist, general and special education teachers, special education aides, and teachers of ‘specials’ (i.e., art, library)”) (Ostmeyer and Scarpa, 2012, p. 935) and two mothers of children with HFASD at an elementary school. I’ll be brief in summarizing the research design, results, and discussion here, because I want to get to the real takeaway, which is the potential for this study as a template for a PAR study at my school.

Research Design:

  • Participants attended either a focus group or an individual meeting, each lasting 60-90 minutes.
  • At the meetings, researchers defined social skills, emphasized their importance, and shared current research about social skills.
  • Participants completed a questionnaire and then participated in guided discussion about how social skills programming could be implemented in their community.
  • Classroom observations were conducted of two male students.
  • Qualitative and quantitative results were compiled and presented to the school stakeholders.
  • A tentative plan for the implementation of a social-skills program was designed.

Results from the Interviews:

  • Participants agreed that social skills were important.
  • School staff participants were wary of programs or interventions that removed students from the classroom.
  • Staff participants were worried about taking time away from core academic subjects.
  • Staff participants were worried about the time needed to train staff.
  • Staff participants urged a model that was inclusive; that is, it didn’t target the students with HFASD but included the whole class.
  • Parent participants indicated that their children might need individualized social-skills instruction.
  • Parent participants worried that teachers would be uncooperative with a new program because of the time crunch for training or other classroom responsibilities.

Results from the Observations:

  • The observed students demonstrated deficiencies in most of the skills listed as important social skills in an earlier part of this discussion (following directions, etc.).
  • Peers of students with HFASD were observed to be patient, understanding, friendly and inclusive but may have inadvertently reinforced some of the disruptive behaviors of the students being observed.

Discussion/Findings:

  • Stakeholders agreed that social-skills training was both wanted and needed in the community.
  • Stakeholders agreed that lack of social skills negatively affected academic performance.
  • Stakeholders believed that educating peers about how to treat their classmates with HFASD and about the characteristics of HFASD would improve social interaction.
  • Stakeholders worried about time away from core academic instruction.

Although this article gave me some very practical insight about how to design a mixed-methods PAR study (How many participants, how many meetings, of what kind? What methods, what instruments?), I’m left with so many questions at the end: the article does not discuss the specifics of the program tentatively designed by the research participants and presented to school decision-makers, nor does it discuss to what extent the program was implemented. As for the observational element of the study, the article uses a mysterious passive voice (“observations of two male students with HFASD were conducted” [emphasis mine]) (Ostmeyer and Scarpa, 2012, p. 936), suggesting that that component of the research wasn’t so participatory after all, but rather conducted by the “official” researchers. Finally, and most importantly, I’m wondering if a study like this, which includes as participants not the marginalized people themselves but people one step removed from the marginalized people, is really true to the purpose of PAR as I understand it.

To me (a newbie to the subject, I grant), this seems like PAR-lite. This study doesn’t locate the power of research with the people on the low(est) end of the power differential; it didn’t yield any actionable, concrete findings that could be or were implemented to immediately and in real ways benefit the children with HFASD; and it doesn’t fulfill the goal/promise of PAR, which is to empower the very people who have been disempowered. Students with HFASD, like students in urban schools, can be “dehumanized, denied agency, and not allowed to speak on schooling conditions from their perspective.” To address this marginalization among underserved urban youth, the Council of Youth Research “works to empower students to become agents of change” (Bautista, Bertrand, Morrell, Scorza, & Matthews, 2013, p. 4). The people who stand to benefit most from the research into social-skills programming for children with HFASD are children with HFASD, and here they are being spoken for by the adults in their lives. This, of course, brings me right back to Aurora Levins-Morales’s Medicine Stories (1998), in which she argues that “the disempowerment we all experienced as children has little outlet. We are taught to obey until our own turn comes, with few opportunities to politicize the experience and critique it” (p. 51). Of course, the parents and staff who participated in this research love the children and want to support and serve them, but they are not, ultimately, the true stakeholders at the center of this line of inquiry. Here is Levins-Morales (1998) again: “The fact that many parents are deeply loving, fair and committed to their children’s well-being does not change the fact that this is largely a matter of luck for the child, that she or he has almost no control over the conditions of daily life” (p. 52). I would imagine that control is further diminished when you’re a kid with HFASD who struggles with social skills.

But PAR is–or has the potential to be–precisely that opportunity for children to politicize and critique their experiences as students! To me, this study is PAR by proxy. I think true PAR depends on that essential tension between the power a person or group has historically held and the power presented by the very act of PAR. It has to challenge, if not invert, the power differential. If you’re doing PAR with people close to the people who stand to benefit most, people related to the people who have been the most silenced, I’m not sure it works. Yes, teachers and parents are to some extent powerless; they can’t make sweeping curricular change on their own without an OK from the top–and there are many layers of top on top: the school administration, the school district, the board of education, the state government, the federal government. But I assume that teachers and parents have some voice and some kind of venue–PTAs or staff/faculty interactions–that the children at the heart of this issue do not. I don’t know how old children have to be before they can be involved in meaningful PAR (in fact, a classmate of mine asked this very question last night! Thanks, Jeff!), but I suspect the answer has to do with what you want out of the PAR, as well as who has to “buy in” to the findings to effect change and how those people feel about children. I also suspect that the answer might be “not that old”: that elementary-aged children can be meaningful participants in PAR.

So back to my school and the idea that’s hatching in my head: What if the students who have those intellectual challenges I listed were recruited to perform PAR at PCDS? What if they worked to design research protocols and to collect data, and then they synthesized and presented their findings to the headmaster, the board of trustees, the faculty body, the parent body, the student senate, the student body? What if their findings had real and immediate impact on the questions we’re asking ourselves now, which include pressing questions about learning management systems, class size and teacher load, technology implementation in the classroom, elective offerings and student choice in curriculum, graduation requirements, capstone projects, college counseling, and community-building, to name only a few? I imagine these students might have some real, heretofore underappreciated wisdom to share that could have immediate impact on the decisions we as a school are making. And I think the very process could serve to empower these students, who I think may be among the most marginalized students in our generally-not-so-marginalized school.

There are some steep-seeming logistical concerns. Here’s a non-comprehensive list:

  • What would be the criteria for inclusion as one of the participants? Would students have to have an official diagnosis, or would self-identification suffice?
  • How could I recruit students in an ethical, sensitive, appropriate way?
  • Would students be reluctant to participate, for fear of “outing” themselves? Would their parents worry about stigma?
  • Would students be inclined to take on the additional work of PAR, on top of schoolwork that might already be challenging because of their intellectual or cognitive challenges?
  • Would students be incentivized to participate without a grade? Could they get class credit for participating? Is that ethical?
  • Would my school administration support or embrace such a project?

I don’t know the answers to those questions, but I think they’re worth exploring. After all, the timing seems perfect (as a school, we are in a period of self-reflection, reevaluation, and change), and the student-centered goals of PAR are absolutely consistent with the values and mission of PCDS. But if I did this, it would have to be the students’ voices front and center–not their parents’, not their teachers’, not their doctors’/therapists’/tutors’/coaches’.

No PAR by proxy at PCDS.

References:

Bautista, M. A., Bertrand, M., Morrell, E., Scorza, D., & Matthews, C. (2013). Participatory action research and city youth: Methodological insights from the Council of Youth Research. Teachers College Record, 115(100303), 1–23.

Brault, M. W. (2012). Americans with disabilities: 2010 (Household Economic Studies, Current Population Reports, pp. 1–23). Washington, DC: U.S. Department of Commerce, Economics and Statistics Administration, U.S. Census Bureau.

Denzin, N., & Lincoln, Y. (2008). Introduction: Critical methods and indigenous inquiry. In N. Denzin, Y. Lincoln, & L. T. Smith (Eds.), Handbook of critical and indigenous methodologies (pp. 1–20).

Dunbar, C. (2008). Critical race theory and indigenous methodologies. In N. Denzin, Y. Lincoln, & L. T. Smith (Eds.), Handbook of critical and indigenous methodologies (pp. 85–99).

Levins-Morales, A. (1998). Medicine stories: History, culture and the politics of integrity. Cambridge, MA: South End Press.

Ostmeyer, K., & Scarpa, A. (2012). Examining school-based social skills program needs and barriers for students with high-functioning autism spectrum disorders using participatory action research. Psychology in the Schools, 49(10), 932–941. doi:10.1002/pits.21646

Stobbe, M. (2011, February 27). AP impact: past medical testing on humans revealed. The Washington Post. Retrieved from http://www.washingtonpost.com/wp-dyn/content/article/2011/02/27/AR2011022700988.html

 

The Dynamic Process of Kindergarten Transition

Rimm-Kaufman, S., & Pianta, R. (2000). An ecological perspective on the transition to kindergarten: A theoretical framework to guide empirical research. Journal of Applied Developmental Psychology, 21(5), 491–511.

The transition into kindergarten signifies a very important step in the lives of young children and their families.  Although many children in the United States attend various types of preschool programs, the transition into formal schooling is a big step both for children who have never had preschool experience and for children who have had the opportunity to engage in a preschool program.  Rimm-Kaufman and Pianta (2000) conceptualize the importance of transition programs and activities in the year prior to kindergarten and offer an approach to these activities grounded in an ecological perspective.  This approach includes three main areas of focus.  First, a focus on relationships between children and their environments, such as schools, peers, families, and neighborhoods (Rimm-Kaufman & Pianta, 2000).  Second, measures of school readiness need to take into consideration the effects that these relationships have on the child.  Third, Rimm-Kaufman and Pianta (2000) discuss the importance of examining how these relationships change over time and affect the child and their transition success.

There has always been a research interest in the process of children transitioning from home or preschool into formal schooling, but the popularity of this topic has increased even more in the educational research field in the last 10 years due to the dynamic nature of our current educational system as well as the changing landscape of our family structures.

The expectations for early learners are continuously changing, increasing, and developing as mandates from federal and state policy makers are implemented to try to raise the bar for educators and their students.  Along with demands for a higher level of academic performance, kindergarten students also have many social-emotional adjustments to make during this transition year.  Independence from their parents, being alert and attentive for five hours a day in school, and transitioning from mostly parent-child relationships to forming and maintaining relationships with their peers are all significant social-emotional adjustments (Rimm-Kaufman & Pianta, 2000).

Another factor that promotes the popularity of research in this area of education is the increased number of children between the ages of 4 and 7 in our country; the United States showed a two-fold increase in the population of preschool-age children from 1973 to 1993.  Changes in family dynamics also warrant research in this area.  There are many more families now than a decade ago that are single-parent households or in which both parents work while their children are small.  Also, there is a growing population of children who are subject to the consequences of welfare reform and are experiencing more stressful home lives (Rimm-Kaufman & Pianta, 2000).

With all of these factors taken into consideration, it is clear that educational systems need reform that includes a comprehensive program accounting for these risk factors as preschoolers transition into formal schooling.  The goal of new research would be to help students begin their kindergarten year with as much support as possible, given their family dynamics and experiences prior to kindergarten, so they are set up for success.

The authors noted that with all of these changes, the way this transition process is studied is evolving.  This evolution has everything to do with the increasingly complex family dynamic and other societal factors.

When researchers first began to look at the transition period into kindergarten, they often focused on child characteristics; in other words, they focused on gender, behavior, ethnicity, and so on.  More popular now is the idea that there are far more elements in a child’s life that can affect the success of their transition into kindergarten.  Researchers now are focusing on societal influences, such as programs that help the child transition (meet-the-teacher events or hello parties), the quality of preschool experiences, and interactions between parents and the child as well as between parents and the teacher (Rimm-Kaufman & Pianta, 2000).  The authors argued that determining what makes the transition into kindergarten successful is complex and should take on a more ecological approach.  An ecological approach can best be understood as looking at persons, families, cultures, communities, and policies and identifying their effects on the child.

All of these factors can help researchers conduct research that better informs policy makers and school districts not only about the importance of preschool-to-kindergarten transition programs, but also about how to develop them for the specific needs of the communities they serve.

 

 

 

Developing the Developmental Instructor

Kozeracki, C. (2005). Preparing faculty to meet the needs of developmental students. New Directions for Community Colleges, 2005(129), 39–49.

My exploration of developmental education continues as I shift my attention toward instructor preparation and development. Higher education faculty, unlike K-12 instructors, are not required to have any specific education certification in order to teach.  All instructors will have strong content knowledge, as that is the emphasis when being hired (whether in a full-time or part-time role) to teach at a college or university.  But instructors who teach developmental courses recognize that a unique skill set is needed to meet the needs of this diverse student population.  Unfortunately, many developmental educators do not have adequate preparation to meet student needs.  Carol Kozeracki’s “Preparing Faculty to Meet the Needs of Developmental Students” explores strategies to prepare faculty to better serve developmental students.

Article Summary

This article explores English developmental education faculty members’ attitudes toward professional development.  The study includes interviews with 36 English faculty members who teach developmental courses, representing seven community colleges in two states (one on the East Coast and one on the West Coast).  Each community college has a large enrollment, exceeding 15,000 students.  The structures of the developmental programs are varied as well: centralized, decentralized, and mixed models (Kozeracki, 2005, p. 39).

The study explored three areas or strategies for faculty preparation: graduate programs, internal professional development opportunities, and professional associations.  For each of these areas, the author interviewed developmental English instructors to gain feedback on their attitudes about each area and solicited recommendations on how to strengthen each.

For graduate programs, this study concluded that “there is a significant gap between what is learned in graduate school and what they need to know to facilitate student learning” (p. 48).  Furthermore, graduate programs for English instructors should include additional training on how to teach grammar, how to properly design a lesson, and strategies for both recognizing and working with students with disabilities (p. 48).

Regarding professional development programs, Kozeracki concluded that developmental English faculty are most responsive to departmental-level programs that provide strategies for meeting immediate classroom needs and to informal dialogue about teaching and learning practices (p. 48).

Finally, developmental English faculty are least interested in professional associations that provide information and resources that are solely theoretically based; they desire to have practical applications to the developmental classroom (p. 49).

Strengths and Critiques

The audience for this study is most likely faculty development professionals, leaders of centers for teaching excellence, department chairs, and administrators responsible for hiring and developing faculty.  One strength of the article is that it provides tangible suggestions to improve professional development programs for developmental English faculty at a community college.  Specifically, it recommends that “more time should be made at departmental meetings in which faculty discuss pedagogical issues” (p. 46).  The author recommends only using outside speakers who focus on issues of “genuine concern” to the whole faculty (p. 46).  The faculty respondents also indicated that more opportunities to engage with faculty outside their respective disciplines would be beneficial (p. 46).  The most radical suggestion is for colleges to set aside one to two hours per week when classes are not offered so faculty can have more focused departmental meetings (p. 46).  This suggestion aligns more closely with the K-12 model of common planning periods.  I appreciated this section of the study, as it provided very concrete recommendations based on the responses of the 36 interviewed English faculty members.  The recommendations are relatively low-cost and simple to implement and may have a positive impact.

Despite the recommendations I found useful, I have many critiques of the article which limit its effectiveness.  First, little is shared regarding the demographics of the interviewed faculty members.  I think their length of service to the institution and their own level of preparation and training to teach developmental education are significant pieces of information that would affect responses about which professional development and training options are beneficial.  Furthermore, faculty attitudes toward graduate school preparation are completely dependent upon each instructor’s personal experiences.  Relating this study to myself, my undergraduate degree is in English and my master’s preparation is in Education.  Consequently, the recommendations offered in this article regarding graduate programs would not apply in my case, as my graduate experience provided me with the content to be a successful instructor.  Second, I thought the study was very weak in identifying which of the instructor training programs may have an impact on student learning.  I do not think simply stating that developmental English instructors believe workshops taught by peers are more beneficial is enough evidence to justify replicating that type of professional development opportunity.  Yes, the instructors like it.  But, and most significantly, is there evidence to show that participation in the specific professional development opportunity impacted the classroom and student learning in any manner?  The study was very weak in linking the training to classroom modifications and student success.  Finally, I was very disappointed in the author’s initial description of the current state of developmental education. I recognize the article was penned in 2005; however, I was still surprised that the author included a reference to C.J. Hardin’s work “Access to Higher Education: Who Belongs?”  Kozeracki quotes Hardin’s work detailing six categories for students who require developmental coursework: “poor choosers, adult learners, foreign students, handicapped students, and ‘users’ who lack clear-cut goals and are attending college more for purposes of avoidance than achievement” (Kozeracki, 2005, p. 40).   This categorization is appalling, as it implies that having a disability or being foreign or being an adult learner or not having a defined goal are indicators of needing developmental coursework.  Hardin, with whom I am not familiar, is obviously not familiar with the community college mission, as those categories describe any number of high-achieving and excellent students within our system.  Inclusion of this categorization made me question Kozeracki’s understanding of the developmental education mission and core principles.

My Take

As an administrator who oversees a center for teaching and learning at a community college, I found this research helpful, as it provided me with tangible, concrete strategies to enhance our professional development programs within GCC and the Maricopa system.  But further research is needed to identify which of these strategies best impact instructional practice and student learning outcomes.  With limited resources and faculty members’ limited time, the activities with high impact and low cost and effort may be the ones to implement first, so additional study is needed to determine which of the strategies and recommendations would have the greatest impact.

I also think the study should be expanded beyond English developmental instructors only.  One could explore the attitudes and perceptions of math faculty who teach developmental courses. Their preparation and needs may be different, requiring unique strategies to meet their discipline and classroom needs.

Finally, this article made me question our faculty hiring practices in higher education.  For the most part, faculty members are hired based on discipline expertise, while teaching expertise is desired but not a must.  This practice has to change regarding our developmental student population.  Collectively, as a college community, we need to be more intentional about whom we hire and why we are hiring them.  Potential faculty members with degrees in education or with significant teaching preparation should be moved to the front of the pool for consideration.  Faculty positions should be posted with requirements beyond content knowledge alone; best teaching practices should be required and then demonstrated through the hiring process.  We owe it to our students to have the best and brightest instructors, with both the content and the teaching expertise to help our students achieve.

Education, Equity, Excellence; Research Blog

Research Publication Blog Post Two

Reference:

Arauz, J. C. (2012, November). E3 presents: Education, equity, excellence (three-part video series), Part 1: What is educational excellence? [YouTube video]. Retrieved from http://www.youtube.com/watch?v=ZBEI6ilDv-0

 

Strengths, Contributions and Ways to Improve; Graphic Organizer

Organization: The video was well organized and directed. The narrator developed the argument, and the animation was creative.

Contribution to Field: The video’s contribution to the field was worthwhile and significant.

Literature Review: The video did not provide a literature review.

Theoretical Framework/Lens: The video clearly demonstrated coherence. The research focused on issues our nation faces with its current education system.

Data Collection: Data was collected from inner-city schools with a predominantly African American and Latino population.

Analysis: The video had a profound impact on current education action research.

Findings: The findings of the video were inconclusive; however, the research does outline some assumptions about culturally relevant pedagogy and its meaning for intercultural learning.

Discussion/Conclusions: The video provides a formula for creating successful plans aimed at intercultural learning.

Minor Editorial Comments: No editorial comments for the article.

Miscellaneous: No miscellaneous comments for the article at this time.

 

 

Culturally Relevant Pedagogy: Ingredients for critical teacher reflection

In the video Education, Equity, Excellence (Part One, 2012), founder Dr. Juan Carlos Arauz discusses the problems and issues our nation faces with its current education system. The video is aimed at responding to the need for culturally relevant pedagogy. Across a three-part YouTube video series, Dr. Arauz proposes a solution to the educational crisis in this country; this blog post will analyze the first video in that series. According to the foundation’s website, Dr. Juan Carlos Arauz is the founding executive director of E3: Education, Excellence & Equity. E3’s mission is to redefine educational expectations so that every student, regardless of starting point, is engaged and thriving in a school that practices a culture of academic success for all (http://www.e3ed.org/about-e3).

This video was significant to me because it caused me to reflect on ways to create culturally relevant pedagogy. Furthermore, the video was especially significant to my area of study, as it examines how critical teacher involvement is as it relates to culture and the classroom. The video implies that teacher education and the creation of culturally inclusive schools and classroom environments are essential. It challenges educators to examine the impact of cultural resilience and offers a unique look at how a student’s journey through school takes on many life challenges; making their way through those challenges builds exactly the skills employers are looking for. I personally feel that it was very insightful to examine how students in difficult situations show resiliency in their everyday struggle. They show this resiliency through:

  • Critical analysis
  • Adaptability
  • Cross-cultural and intercultural communication
  • Collaboration and innovation

This study has caused me to critically examine the relationships between student involvement and the educational achievement gap.

Another dimension of the video was low-income and immigrant students’ need for culturally relevant pedagogy. Specifically, recommendations are offered for teacher preparation and in-service teacher professional development. I learned that educators must reconceptualize the way they teach in order to serve a more diverse student population. The video also gave some very interesting statistics I had not seen before. These statistics serve as a wake-up call for educators and administrators, suggesting that cultural sensitivity and diversity training should be a part of every educational program.

This video has a direct correlation to my own experiences. I began my teaching career in a predominantly minority school. As a new teacher, it was very important for me to understand the culture, teach the core curriculum, and reconceptualize my teaching strategy. I did a lot of self-reflection, as some things worked and some things did not.

In conclusion, I firmly believe the impact of the three-part video series on education research is profound. It opened my eyes to the need for culturally relevant pedagogy. As stated in the conclusion of the video, “In order for students to be prepared for 21st century needs, educators must show students how to use their everyday skills so they can proudly stand up and say I am innovative, culturally resilient, adaptive, collaborative, and cross culturally aware.” I believe this statement speaks to how this knowledge can impact not just the teacher but also the student and learning community.

Active Learning in Health Professions

McLaughlin, J. E., Roth, M. T., Glatt, D. M., Gharkholonarehe, N., Davidson, C. A., Griffin, L. M., Esserman, D. A., & Mumper, R. J. (2014). The flipped classroom: A course redesign to foster learning and engagement in a health professions school. Academic Medicine: Journal of the Association of American Medical Colleges, 89(2), 236–243.

The article, The Flipped Classroom: A Course Redesign to Foster Learning and Engagement in a Health Professions School (McLaughlin et al., 2014), is about how the University of North Carolina (UNC) Eshelman School of Pharmacy redesigned the course Basic Pharmaceuticals II using a flipped classroom model. The course redesign was “inspired by a desire to transform the educational experiences of our students and to meet students’ requests for enhanced in-class active learning experiences” (p. 237). The article discusses the changes implemented in the course redesign, which include replacing in-class lectures with online videos watched outside of class and then spending valuable class time on active learning exercises. The three main focal points are offloaded content (recorded videos, etc.), student-centered learning, and appropriate assessments.

In trying to determine whether implementing a flipped-classroom model would be effective, the researchers obtained approval from the UNC institutional review board in order to administer pre- and post-surveys regarding demographic information, students’ perceptions of active learning activities, preferred curriculum delivery format, and engagement. In addition, they collected data on exam scores and additional assessment tools and compared the outcomes of the traditional classroom format (class of 2011) to those of the students who participated in the flipped classroom format (class of 2012). The findings indicated that overall student learning increased after participating in the flipped-classroom format.

Review of Strengths and Contributions

Organization – The article was well organized and constructed. I particularly valued that the authors compared the traditional lecture and course design to the newly implemented student-centered pedagogy.

Contribution to Field – The authors acknowledge that there have been many significant changes to how healthcare is delivered and discuss the increasingly complex healthcare system, yet state, “little has changed in the way that education is structured and delivered to aspiring health professionals” (p. 236). This article contributes to the field by showing that incorporating active learning into a classroom can enhance learning, improve outcomes, and more fully equip students to address 21st-century health care needs.

Literature Review – In my review of the article, what I found most interesting is what is happening in traditional classrooms. Some of those findings included that “students’ attention declines substantially and steadily after the first 10 minutes of class and that the average attention span of a medical student is 15 to 20 minutes at the beginning of class. Although students’ attention returns in the last few minutes of class, they remember only 20% of the material presented during that time. Furthermore, passive learning in hour-long lectures often bores students and can deprive them of rich educational experiences” (McLaughlin et al., 2014, p. 236).

Analysis/Findings – The authors compared pre- and post-course survey responses, course evaluation responses, and final exam scores between the traditional and flipped-classroom cohorts. The findings indicate that the students in the flipped classroom rated the class higher in areas such as comprehension of material, engagement during class, and preparedness.

Discussion/Conclusions – The authors discussed in detail how the course was redesigned but, more importantly in my opinion, honestly discussed the time commitment for both the instructor and the teaching assistants (TAs). While the initial time commitment by the faculty is significant, it will decrease in subsequent years; for the TAs, however, the time commitment will remain static. By showing the time-commitment implications, the authors may motivate future teachers to incorporate active learning techniques and help them feel confident that in subsequent classes or years they will not need to devote as much time to planning the same class material.

Miscellaneous – What I found particularly valuable were some of the next steps and changes that will be implemented for the spring 2013 class. Some of those changes include no longer considering the textbook to be required reading, replacing the student presentations and discussion with a new 30-minute active learning exercise, and creating “an online 411 Pharmacopedia to be used as an information portal for expanding concepts, new technologies, breaking news, current clinical trials, new drug products, and Web links” (McLaughlin et al., 2014, p. 242). This showed that the authors were incorporating ways to improve the course.

Response

In my blog post from last week, I reviewed the article Does Active Learning Work? A Review of the Research (Prince, 2004). While I am in the infant stages of researching IF and HOW active learning works, I happily find myself being drawn into wanting more information. Some of my curiosity revolves around how students balance their in-class time with their out-of-class responsibilities and what the long-term material-retention statistics are for those who participate in an active learning setting versus a traditional lecture classroom setting.

I am interested in implementing more active learning sessions for a course that I co-direct for fourth-year medical students. During their final year of medical school, the students are in their elective rotations locally and across the country. In the spring, prior to graduation, we bring them back for a two-week course that is designed to help prepare their transition into residency. There are some active learning sessions during these two weeks, but approximately 80% of the course sessions are lecture based. In working with the director of the course, we are trying to develop sessions that are more interactive and that, in particular, enhance the ways we assess the students’ clinical skills. I feel this article (McLaughlin et al., 2014) and the study described within can help persuade administration to allow us to achieve our goals of designing more active learning sessions and moving away from the traditional lecture-based sessions.

References

McLaughlin, J. E., Roth, M. T., Glatt, D. M., Gharkholonarehe, N., Davidson, C. A., Griffin, L. M., Esserman, D. A., & Mumper, R. J. (2014). The flipped classroom: A course redesign to foster learning and engagement in a health professions school. Academic Medicine: Journal of the Association of American Medical Colleges, 89(2), 236–243.

Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223–231.

Making the Transition: High School to College

Reference

Venezia, A., & Jaeger, L. (2013, Spring). Transitions from high school to college. Future of Children, 23(1), 117–136.

In continuing my review of scholarly writings this week, I found the article Transitions from High School to College, which provided information that directly relates to my line of inquiry. The title served as a direct introduction to the subject matter presented by the authors. The focus of this piece was to provide research and insight on current trends in interventions to improve access to higher education for high school students in the United States. Venezia and Jaeger (2013) presented their research and information by looking at the state of college readiness among high school students, the effectiveness of programs in place to help them transition to college, and efforts to improve those transitions.

In looking at the formulation of the research presented in this reading, it was very easy to see from the outset that the authors focused first on using quantitative data to support their findings. The researchers looked at statistical data from various sources, including the National Center for Education Statistics, the U.S. Department of Education, the College Board “SAT Report,” and several others. I felt the research was conducted from a more analytical approach than anything, and although the text provides some excellent support for the authors’ positions, it seemed to be a report motivated mainly to influence policy makers. The study included very few humanizing elements, which told me that a data-driven approach shaped the research methodologies used for this report.

The information presented was intended to help readers understand, first and foremost, that there is a problem in the U.S. with students being ill prepared for college entrance. The report also lightly approached the idea that social inequity continues to hamper access to college in underserved communities in the U.S. The paper leaves readers contemplating the effectiveness of current measurement tools for college readiness, because it is something that history shows is challenging to track. The report also helps readers understand that the college transition challenges facing high school students are recognized at the national level; hence, there is state and federal funding currently being used for success programs like TRIO, Early College, GEAR UP, and Upward Bound. The article also reflects on the Common Core standards and the push at the national level for college preparedness, and it presents readers with the idea that there is not one particular fix for guiding students toward college and career readiness. The end findings of this article can be summed up in one of the authors’ final statements. According to Venezia and Jaeger (2013), “While great variation in approaches and implementation strategies will no doubt continue, the field would benefit from a more comprehensive and consistent method for learning what works across different types of reforms—for example, using similar definitions and metrics—to help clarify what is transportable, effectively, across different contexts and scaling needs” (p. 132).

As a reader, I thought the authors of Transitions from High School to College did an excellent job of initially capturing my attention by presenting their stance and position on what the reading was going to present. The piece itself was very coherent from start to finish, and all topics were presented in a measured fashion to help the reader understand the problem, potential solutions, and end findings. The data was structured into the writing nicely to help support the authors’ points and to help readers continue to build an understanding of the issues of high school to college transitions in our country. One of the strongest points in this article came as the development of the argument was laid out: as a reader, I could sense that there were not going to be any robust final recommendations to solve the issues being presented.

In reflecting on this reading, I think it may contribute to my research and field of inquiry. Was this article worthwhile? I would say yes, to an extent. I will keep it in my archive for the reference points that I felt were very sound. I will say the argument was not supported as well as I thought it could have been, because the authors did not connect the reader with the human side of the challenges being presented. I know not all scholarly writings are intended to appeal to a reader’s emotions, but if the authors had provided a more human element, this reading would appeal to me even more.

Some of the key items in this article were highlighted in the way the authors framed their argument and presented their story while supporting it with data. As I read through the material, the authors helped me see some of the challenges that are faced in measuring the topic at hand. Another part of this writing that impressed me was the authors’ final take on the analysis presented. Their closing point, that in addition to directly supporting academic preparation for students, capacity-building efforts need to focus on ensuring that large comprehensive high schools have strong college-going cultures and on providing the necessary professional development for educators to help all students meet college readiness standards (Venezia & Jaeger, 2013), was well stated. Again, these are not robust findings, but the authors helped me understand that more research is needed in this area.

The authors of this paper made good connections between their analysis and the material being presented. As a doctoral student who is looking at research methods and tools, I definitely found some of the examples and materials used to be potential tools for my future research efforts. I found clarity in the way some of the material was presented, but also questioned some portions of the writing. I learned from this piece that sound data and resources can help the reader better understand the issues you are looking at. But when topics are presented with few findings to support the current actions being taken, it can be challenging to transfer your message to the intended audiences.

I selected this article to review because I found some positives and negatives in how it read. My final thought was that the use of statistics and data can do an excellent job of helping support an issue you want people to recognize. The authors of this article did an excellent job of using national statistics and measures to help show readers the impact of the issue at hand. In my professional setting working in higher education, I occasionally have to work with top-level leadership to present my views or ideas and to gain support or funding for projects or other needs. The use of data to support my argument always seems to play a vital role in the impact I can have on my audience; solid supporting quantitative data can always help. On the other hand, this study might be able to build on its argument and position by bringing in more qualitative research. Storytelling focused on the collective impact of the challenges being faced and the results to be had from some of the programs discussed may help this research become even sounder in the future. The result of reflecting on this article: I had some positive takeaways to help me in the review of research practices, but still many questions to answer to ensure I can become a successful academic researcher in time.

The Importance of Student Motivation in Short-Term Study Abroad

Allen, H. W. (2010). What shapes short-term study abroad experiences? A comparative case study of students’ motives and goals. Journal of Studies in International Education, 14(5), 452–470.

Summary of Study
Having been a participant on a short-term intensive French language study abroad program twice myself, I found Allen’s study to be very relatable.  Allen chose to examine the goals and motivations that shaped two female students’ short-term study abroad experiences, specifically examining their language learning while participants on a 6-week program in Nantes, France.  If there are naysayers out there about the ability of short-term programs to instill long-term cross-cultural benefits in students, it seems that there are even more academics who contest the true language skills students are able to derive from participating on a short-term study abroad program such as this one and the two I completed.  Consider the literature Allen cites, including Davidson (2007), “[Davidson’s research] claimed that for programs of 6 weeks or less, development of linguistic and cultural proficiency is extremely unlikely to occur” (p. 453).  Certainly I do not think that fluency in a language is something to be expected by participating on a program of such a short length, but reflecting on my personal gains, I cannot help but think there are other gains made, such as the acquisition of more colloquial and current vocabulary, as well as gains in self-confidence leading to continued study of the language.  This latter point was something that Allen actually examined in this study as well.

Allen’s findings indicated that the motives surrounding each student’s reasons for pursuing study abroad seemed to have the most impact on determining to what extent the students’ language levels developed during the study abroad program.  Allen’s connection to Lompscher’s (1999) characterization of the types of learning motives was very powerful in contextualizing this finding, explaining that “Molly’s [learning motives] were consistent with social learning (i.e., to communicate or cooperate with others) and higher level cognitive motives (i.e., arising from her intrinsic interest in learning), whereas Rachel’s were consistent with lower-level cognitive motives (i.e., learning with the goal of obtaining a result)” (p. 467).  Therefore, it was logical to see that after the program, Molly went on to declare a major in French and was considering moving to France upon graduation, whereas Rachel ended her study of French after she completed her minor in it, something she had cited as a reason for participating in the study abroad program in the first place.

Strengths and Critiques
Though the research questions posed in this study are not terribly unique, as language acquisition during study abroad programs has been studied before, I found Allen’s use of an activity theory perspective to be quite revealing in the study’s findings, namely attributing importance to students’ own goals and motivations for participating on a study abroad program.  Overall, I find the article to be well organized and developed, leading to very logical conclusions and important discussion for the field (particularly for advisors and faculty leading short-term programs, who are perhaps best positioned to work with students on identifying their goals and motivations); however, the fact that Allen only examined two students on a single program makes drawing any large-scale conclusions rather tough.  The methodology included recruiting eight participants and then hand-selecting two females based on shared characteristics such as GPA, little prior travel experience, and similar levels of French language skill. I presume this allowed for an even ground for comparison; however, I wonder if bias might have been present, given that this does not seem very randomized.  The study’s main source of data came from participant blog entries and three interviews.  A pre-trip questionnaire and one administered during the program’s final week were also used.  The researcher also explains that she served as the program director for the trip and interacted with participants weekly, “allowing establishment of trust”; however, I think that this, too, might have introduced some level of bias when analyzing data.

Relation to Personal Experiences
As I indicated, since I participated on a short-term French language program, it was helpful for me to reflect on my own motivations for pursuing a study abroad program and to think about how much I feel I learned on my program in terms of language acquisition.  For me, participating on the study abroad program was not an option but something that I felt I had to do in order to come closer to my overall goal of one day being fluent in the language.  This seemed similar to Molly’s motivation as Allen indicates, “What led Molly to participate in study abroad was that full immersion was critical for achieving French fluency” (p. 457).  However, like Molly, I fully acknowledged that participating on a short-term program was not going to make me become the fluent speaker that I desired to be.  Rather, I saw it as an important next step in bringing me closer to that goal.  After having 4 years of French study, it was time to take what I learned in the classroom and put it into practice, which was an important step in building my confidence at speaking French.

At the end of my 4-week program, I felt pleased with what I had managed to accomplish.  Though I was not fluent, the biggest gains for me were that 1) I had connected with French people my age who taught me how young French people really speak (colloquial and slang variants as opposed to the textbook French I had been learning) and 2) I had succeeded in communicating with a variety of French people in multiple aspects of their society, not the least of which was passing two courses taught completely in French at a French institution.  These successes were critical for me in building my self-confidence in continuing to speak and learn the language and, I think, helped me make gains in the language at a quicker rate than I was making in a traditional classroom setting.  I liken this to the fact that, like Molly in Allen’s study, my goals were more in line with wanting to be able to communicate and learn for the sake of learning, rather than merely wanting to complete the program to advance a degree requirement, which is more in line with Rachel’s motivations.

Ideas for My Area of Interest
In Allen’s discussion and conclusion, she raises two excellent points for my area of interest.
First, based on her findings, Allen asks, “How can study abroad curricula accommodate students with varied motives and goals who enact agency in different ways?” (p. 469).  Given that students’ reasons for pursuing study abroad opportunities are so varied, how can international educators and faculty directors work together to develop learning outcomes that speak to these differences?  As Allen discovered in her interviews with Rachel, part of the reason Rachel did not advance as much as she wanted with her French language skills was that she was unable to adapt her learning methods to a style outside of the traditional classroom.  International educators need to develop ways to identify students who are not as self-sufficient at adapting to the new learning environments that study abroad entails and provide interventions that assist these students accordingly.

Secondly, and this is a topic I have considered for my own research before, Allen discovered that “Blogging can serve as a tool for self-reflection and goal-setting; however, it is evident that blogging without faculty mediation or other intervention is insufficient” (p. 469).  Engaging students in meaningful reflective practices before, during, and after their time abroad is, I believe, necessary if students are to derive maximum benefit from study abroad.  However, this reflection must be monitored, and program leaders must intervene in order to help students make meaning from their experiences and reflections.  One idea I have is the development of an online course module that study abroad participants would complete during their experience abroad, regardless of which program they were studying on.  This module would be designed as a reflection intervention, giving international educators a way to help students work through the obstacles they might encounter while pursuing their goals and developing new skills on their programs.

Reference:
Allen, H. W. (2010). What shapes short-term study abroad experiences? A comparative case study of students’ motives and goals. Journal of Studies in International Education, 14(5), 452–470.

Teaching Pre-service Teachers Content Area–Easy…Technology–Not So Much

Hubbard, J. D., & Price, G. (2013). Cross-culture and technology integration: Examining the impact of a TPACK-focused collaborative project on pre-service teachers and teacher education faculty. Journal of the Research Center for Educational Technology (RCET), 9(1).

Cross-Culture and Technology Integration: Examining the Impact of a TPACK-focused Collaborative Project on Pre-Service Teachers and Teacher Education Faculty (Hubbard & Price, 2013) reports on research conducted with pre-service teachers.  The intent of the research was two-fold.  First, it was to have pre-service teachers create an instructional lesson using the TPACK model as a basis.  Second, it was to determine how those pre-service teachers might incorporate technology into their lessons once they become in-service teachers.

The study was based on research supporting several sub-components of TPACK (Technology, Pedagogy, And Content Knowledge).  The investigators presented literature backing the need for research in five different intersecting categories and, after detailing each, chose to focus on just two of them for this study, the last two of the five.  Laying out the literature section in this way proved to be one of the weaknesses of the article: the most important items should have been listed first and given more weight and supporting research, and the authors never explained why those two of the five were targeted.

The researchers go on to describe that they wanted the pre-service teachers to create a Learning Activity Type (LAT) project that required the application of inquiry-based social studies skills.  They also mandated the use of Microsoft Photostory 3.0 for the technology component.  The pedagogical basis was the concept of culturally responsive instruction, which corresponded to the social studies content knowledge of multicultural and global perspectives.

The requirement for the pre-service teachers was to interview a foreign-born person and then use internet research skills to gain additional information about the country that person came from.  They were then tasked with organizing that information and using Microsoft Photostory 3.0 to create a final project telling the story of their interviewee’s life, culture, and heritage.

Eighty-three students participated in this project, all of whom were juniors at a university enrolled in a K-6 elementary education program.  The students were assigned to one of four classes of about twenty students each, and each class was taught by one of four instructors.  The classes also met periodically in the computer lab, where a separate instructor was available solely to help with the technology aspect of the assignment.  That person kept a journal of his observations of the classes for the research project but was not considered an instructor for purposes of this study.

The quantitative data came from two sources: surveys given to the students and a survey given to their instructors.  The student survey had nine questions describing the learning experience, using two versions of a Likert scale, one for the first five questions and another for the remaining four.  Of the 83 surveys returned, 82 were usable.  The other set of data came from the instructor survey.  Of the four instructors, one was also one of the researchers, and that person chose not to complete a survey in order to minimize bias in the results.  This is another weakness of the research: there were only four instructors to begin with, and one, rightly, declined to fill out the survey.  That reduced the instructor responses by 25 percent, and with such a small sample size to begin with, that may have had a large impact on the results.  A strength, however, is that the other three surveys were sent out to be evaluated by a separate set of technology experts rather than by the researchers working on this project, in order to minimize any conflict.  The instrument had a reliability coefficient (Cronbach’s alpha) of .832, and the standard error of measurement was found to be 2.033 (Hubbard & Price, 2013).
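To put those last two statistics in perspective, here is a quick check of my own (not the authors’), under the assumption that the standard error of measurement was derived from the standard classical test theory relationship between the SEM, the scale’s standard deviation, and its reliability:

SEM = SD × √(1 − α), so SD = SEM / √(1 − α) = 2.033 / √(1 − .832) ≈ 4.96

In other words, if that formula was used, total scores on the nine-item instrument would have a standard deviation of roughly five points.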

The results from the first five questions on the student survey showed that the pre-service teachers were pleased with the class: between 86.6% and 95.1% answered “fairly” or “very” useful on items such as the hand-outs used in class and the overall usefulness of the class.  The remaining four questions drew a more varied response, with only 37.9% reporting that they were fairly or very likely to use Microsoft Photostory 3.0 as a teaching or learning tool.  The results of the instructor survey showed that although the instructors felt “very satisfied” with the course, they lacked comfort with the technology tool being used, which left them unable to help their students to the extent they would have liked.

In addition to the two surveys, artifacts were collected throughout the research project: the researchers recorded classes on video, held one-on-one meetings, took notes, and interviewed pre-service teachers.  The results indicated that the project did not overwhelmingly help pre-service teachers view technology as a necessary component of teaching, though it did help them gain an awareness of technology and content knowledge (i.e., the culturally responsive component).  The sample, however, is too small for these results to be generalizable.

Although this research focused on pre-service teachers and I want to create a TPACK action research project for my fifth grade classroom, I still found many ways to apply these concepts to my own project.  One finding that emerged from the survey results was that the pre-service teachers did not understand why they were required to use Microsoft Photostory 3.0; they saw it only as a requirement to contend with rather than a concept to master.  I can definitely apply that to my research: my students may respond better to the technology I use in my study if they understand that it is something to learn in and of itself and not just a meaningless requirement.

I also made a connection between this piece and the article Rural Elementary School Teachers’ Technology Integration (Howley, Wood, & Hough, 2011).  That article described how vital the teacher’s attitude is to the successful implementation of technology.  Although the present study differs in that the instructors did want this to be a successful experience, their survey results showed discomfort with the tool, and they conveyed that not knowing how to use the technology left them unable to help their students during their projects.  I wonder how much more successful this entire study might have been if the four instructors had been well-versed in the tools they were requiring their students to use.

The final connection for me is the concept of completion.  Although sharing was not a formal requirement, the students were not allotted time to share their projects.  While the authors may not have seen sharing as necessary for successful implementation, I wonder if it might have made the project more appealing to the students, raised the scores of some of the pre-service teachers to some degree, and ultimately increased their desire to implement technology in their own classrooms down the road.  That, for me, is another lesson I will take into my own action research project.

Howley, A., Wood, L., & Hough, B. (2011). Rural elementary school teachers’ technology integration. Journal of Research in Rural Education, 26, 1-13. Retrieved from http://www.mendeley.com/catalog/rural-elementary-school-teachers-technology-integration-3/