Inconclusive research = lame duck

Eynon, R., & Helsper, E. (2011). Adults learning online: digital choice and/or digital exclusion? New Media & Society, 13(4), 534-551.

As I continue down the breadcrumb path of research in the general area of my inquiry, I’ve decided to open up my research to encompass articles that touch on a range of topics:
adult online learning,
effective practices for online learning,
assessing online learning,
unfacilitated online learning,
facilitated online learning,
adult communities engaged/disengaged in online learning,
various theories and frameworks that relate to online learning,
and, what I’m calling, the catch-all connection to online learning.

If I cast a wide enough net, I’m sure to catch a guiding research question that makes the most sense for my work, and in the meantime I can build up a great arsenal of general knowledge about what research has been done in online learning and how it was orchestrated.

This particular article falls under the umbrella of “adult communities engaged/disengaged in online learning”. The study took place in England, Scotland and Wales and aimed to investigate which adults were engaging in online learning activities. These activities were organized into twelve general areas, including fact checking, travel, shopping, entertainment, social networking, e-government, civic participation, informal learning and formal learning/training. The researchers wanted to delve further into the characteristics that these adults had in common and the factors that influenced them to engage or disengage with different types of online learning. They focused on a central divide in the population and wanted to distinguish between voluntary or choice reasons and involuntary or “digital exclusion” reasons (536).

The researchers’ methodology centered on the 2007 Oxford Internet Surveys (OxIS), with a sample size of 2,350 people. These individuals were selected by first randomly choosing 175 regions in Britain and then randomly selecting ten addresses within each region. The participants were aged 14 and above, and the OxIS was administered to all of them face-to-face.
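The two-stage sampling design described above (regions first, then addresses within each region) can be sketched roughly as follows. This is only an illustration: the region and address lists are invented, and the real OxIS could survey more than one eligible person per address, which is presumably how the respondent count exceeds the address count.

```python
import random

# Hypothetical sketch of two-stage random sampling: pick regions first,
# then pick addresses within each sampled region. All data are invented.
random.seed(0)

regions = [f"region_{i}" for i in range(300)]
addresses = {r: [f"{r}_addr_{j}" for j in range(40)] for r in regions}

sampled_regions = random.sample(regions, 175)          # stage 1: 175 regions
sample = [addr
          for r in sampled_regions
          for addr in random.sample(addresses[r], 10)]  # stage 2: 10 addresses each

print(len(sample))  # 175 regions x 10 addresses = 1750 sampled addresses
```

The appeal of this design is practical: interviewers only travel to 175 regions instead of addresses scattered across all of Britain, while both stages remain random.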

As the researchers consolidated the results and captured their findings, they divided the disengaged respondents into two groups: ex-users and non-users. There were four key reasons why these individuals disengaged from the internet: costs, interest in using the internet, skills and access. It appeared that ex-users had made the choice to disengage because they were no longer interested in the internet, while the non-users highlighted factors of digital exclusion like access and costs.

When it came to investigating why users were disengaging from the internet for learning opportunities, the characteristics of the users did show trends, but these were complex and often depended on what type of online learning activity was happening. Users who were highly educated, had children over the age of 10 and had high levels of internet self-efficacy were found to be more likely to engage in formal learning activities via the internet. An underlying element that was important for informal learning was having access to the internet at home. Also, upon analysis of the data set and its trends, the researchers began to see pockets of individuals who were “unexpectedly included” or “unexpectedly excluded” (542).

They conclude the research by stating that this investigation into users’ engagement and disengagement with the internet and online learning is important because it demonstrates that the more information organizations or educational institutions have about a user, the more likely they are to be able to provide tailored, differentiated user support to increase the amount of learning activity that takes place.

If I were to turn my critical eye on this article, I unfortunately find more areas for improvement than strengths. One of the article’s greatest strengths is that the content is organized clearly, and it did a thorough job of contextualizing the importance of finding answers to the research questions.

An immediate area for improvement is the literature review and theoretical framework supporting the research. It truly appears that more time was spent explaining the need for the research than on the content and theory that act as the foundation of the work.

The data collection process in general appeared to be well thought out, with good randomization and quite a large sample size. Unfortunately, a few elements of the methodology are missing. Why was the OxIS survey tool used, and how was it the right tool for this research? What questions are on the OxIS? Also, there was no explanation of how the surveys were conducted beyond “face-to-face” (537). Were responses recorded by the participant or the researcher? How many researchers were involved, and was there any training to maintain consistency among the team? And what protocol was used during the survey?

The analysis of the data and the findings were detailed, but it almost seemed like the research didn’t find much, and what was found supports what would already seem sensible. As for the discussion and conclusion, it seems like the conclusion was simply that this survey could be done better in the future, and that next time around they’ll also gather qualitative data. If that’s your conclusion, did we find out anything important in this study at all? And if the takeaway is that there should be more studies in the future, then that’s not much of a conclusion. Since the study lacked conclusiveness, it makes sense that the authors weren’t able to offer suggestions for how educational organizations could provide tailored or individualized supports. It also seemed like there was no clear way to distinguish between factors of choice and those of digital exclusion.

Personally, I think the researchers have a lot of room for development when it comes to building on this research. I agree with them that a strong next step would be to combine the survey with an interview to capture some qualitative data. It would also be worthwhile to partner with one of the informal or formal institutions they identified, surveying and interviewing learners before and after their online learning experience, as well as capturing the actions the organization took to onboard learners, orient them to the technology and learning scope, and support their ongoing learning. This data, in concert with the other data collected, could help paint a clearer picture of who these online learners are, what needs they have, and whether the organization is fulfilling those needs.

I think this second round of research could have great potential for providing more access to equitable educational opportunities. If these researchers could really home in on the factors that exclude learners from online opportunities, or even the actions that unexpectedly help include individuals, then that information could be used directly by organizations to help these learners access learning opportunities that otherwise would not “exist” for them.

This article continues to paint the picture of how much support, thought and detail should go into your writing and research. I learned something important tonight: if you don’t have anything conclusive in your conclusions, then something went wrong!

Which teachers benefit from coaching?

Marzano, R. J., Simms, J. A., Roy, T., Heflebower, T., & Warrick, P. B. (2013). Coaching classroom instruction. Bloomington, IN: Marzano Research.

Ross, J. A. (1992). Teacher efficacy and the effect of coaching on student achievement. Canadian Journal of Education, 17(1), 51-65.

The journal article Teacher Efficacy and the Effect of Coaching on Student Achievement by Ross (1992) illustrates the link between teacher efficacy, instructional coaching and student achievement. The researcher started with the question, “Who benefits from coaching?” (Ross, 1992, p. 62). The study included 18 history teachers in an Ontario school district. The selected teachers had a wide range of experience and demographic factors. The study also followed six coaches who supported the teachers. The identified coaches were highly competent in the area of history and also had a wide range of demographic factors and experience. All eighteen teachers were tasked with implementing a new history curriculum and were provided with three main resources: the curriculum materials, three half-day workshops, and contact with a coach (Ross, 1992, p. 54). Contact with the coach was defined as face-to-face or virtual meetings, with a minimum of one contact of each type between the coach and the history teacher. The district assigned coaches to each teacher based on their physical location. The coaches had their own community of practice to support one another throughout the study.

The study collected data on student outcomes, teacher efficacy and coaching. Student outcomes were measured by a multiple-choice pre- and post-assessment in the area of history. Teacher efficacy was measured through a self-report from the eighteen teachers: “Subjects used a six-point agree/disagree scale” (Ross, 1992, p. 55). The researchers collected data on coaching in two different ways, through an interview and a self-administered questionnaire.

Findings from the study indicated a significant increase from pre- to post-assessment on the student outcome measure. The teachers who had the most contact with their coach had higher student results. The author also found that teachers with higher self-efficacy had a higher frequency of interactions with the coach and higher student achievement results. Overall, the “investigation found that all teachers, regardless of their level of efficacy, were more effective with increased contact with their coaches” (Ross, 1992, p. 62). One of the surprising findings from this research was that teachers who had the most principal contact had some of the lowest student outcome results.

The discussion portion of the article was a strength. The author revisited the driving research question and how it was answered by the study. In addition, Ross (1992) shared three hypotheses he had going into the study and explained how each was or was not confirmed. This was helpful because it gave the reader more insight into the design of the study. The author used this section to connect to other research, showing similarities as well as highlighting what was unique about this study. These intentional connections allowed the reader to make sense of how this study fits into the field of coaching. Ross (1992) also used the discussion section to suggest possible future research.

One way to improve this study would be in the area of data collection, specifically the data collected on coaching. The self-administered questionnaire collected at the end of the study only gave information on how frequently the history teachers interacted with personnel resources. It did not reflect the quality of the coaching interactions or whether the interactions had a direct connection to the student outcomes. In addition, the questionnaire was not limited to the coaches assigned to the teachers; it gathered information on several layers of support: the assigned coach, other teachers in the school, the coaching network and school administrator support (Ross, 1992, p. 55). The data on how frequently the teachers interacted with administrators and colleagues at their school didn’t seem to align with the driving research question, “Who benefits from coaching?” (Ross, 1992).

I think the study would have been improved if the researcher had collected data not only on the frequency of the interactions with coaches but also on the type and quality of those interactions. Ross (1992) explains that the coaching “relationship was less reciprocal in that the coaches were relative ‘experts’ in the history program and there was virtually no classroom observation component” (p. 54). Thus, coaches’ feedback was based almost entirely on teacher report and on other artifacts such as lesson plans and student work. I believe in the power of in-classroom coaching. Marzano and Simms’s (2013) Coaching Classroom Instruction describes how “traditional professional development usually leads to about a 10% implementation rate” (p. 6). The authors went on to reveal that “our experience has shown that when teachers receive an appropriate amount of support for professional learning, more than 90% embrace and implement programs that improve students’ experiences in the classroom” (Marzano & Simms, 2013, p. 6). I believe the appropriate amount of support for professional learning includes assessing what coaching support would be best for the teacher, such as in-class observation, modeling or team teaching. Therefore, it would have been beneficial to also collect data on the type and quality of the interaction.

I believe researching the impact coaches have on teacher effectiveness and student achievement is worthwhile and contributes positively to the field of education. I was an instructional coach for several years and had the opportunity to participate in a national coaching study with the American Productivity and Quality Center (APQC), with a leading expert in coaching facilitating the study. The goal of that study was to identify a direct link between the work of instructional coaches in supporting teachers and student achievement. After reading this article and participating in the APQC study, I am interested in continuing to research how access to coaches supports teacher effectiveness and student achievement.

“Learning styles” and education in a controlled environment

Pashler, H., et al. (2009). Learning styles: Concepts and evidence. Psychological Science in the Public Interest, 9(3), 105-119.

People learn best in different ways.  This is a deceptively simple and interestingly familiar idea in modern educational research and curriculum design.  It’s also a concept accepted—or at least understood—by a wider general public, and fits nicely within the twenty-first century cultural (and technological) context in which personalization is easily available, expected and best.  But regardless of this wider acceptance, is there quantitative evidence to support the theory?  Pashler et al. (2009) set out to explore the current literature, historical context and quantitative support for what they term “learning styles.”  Through what historical context did this idea germinate?  What experimental methodology would best quantitatively prove its efficacy?  Has such research been performed in the current literature, and if so, what does the evidence prove?

It’s all Jungian

The authors begin by situating the idea of categorizing people into disparate “types”; this, they explain, draws from the work of Jung, whose research in psychology and psychoanalysis led to the creation of behavioral tests—like the Myers-Briggs—that perform much the same function as learning styles.  Such tests categorize people into “supposedly disparate groups” based upon a set of distinct characteristics, which in turn supposedly explains something deeper about a person.  Although the authors do not regard these tests as objectively scientific, they do note that they have “some eternal and deep appeal” (p. 107) with the general public.

The authors hold that this “deep appeal” partially explains what draws researchers, educators, learners and parents to the idea of learning styles.  Beyond being a method to feel like a larger and often cumbersome system is treating a learner uniquely, the authors write that learning styles can become straw men for underachievement:

“If a person or person’s child is not succeeding or excelling in school, it may be more comfortable for that person to think the educational system, not the person or the child himself or herself, is responsible” (p. 108).

Even including the evidence presented, this is an unfair prognostication.  In their desire to explore the objective science of learning styles, the authors have shut down consideration of a slew of external confounding factors, including socioeconomic stressors, racial background and cultural barriers, all of which have a demonstrated influence on classroom performance (Howard, 2003; Liou, 2009).  More than that, however, this passage reflects an underlying bias in the authors’ commentary: that a theory is lesser when it speaks to people emotionally.

What are learning styles really for?

However, when the authors break down the unspoken hypotheses that govern the idea of learning styles, they make an excellent point.  There are two very distinct issues at play:

  1. The idea that if an educator fails to consider the learning styles of his or her students, the instruction will be ineffective (or less effective).  The authors also consider what they term the reverse of this assumption: that “individualizing instruction to the learner’s style can allow people to achieve a better outcome” (p. 108).
  2. What the authors term the meshing hypothesis, which assumes that students are always best “matched” with instructional methods that reflect their learning style.

These represent both disparate theories of curricular design and widely differing levels of analysis; whereas the first hypothesis presented above represents the assessment of learning styles as critical to the creation of a curriculum, the meshing hypothesis treats learning styles as more of a delivery method.  Most importantly, by confusing these two ideas in exploration of this theory, researchers overlook the possibility that one may prove true while the other does not.

One experimental methodology to rule them all

Before reviewing the current literature, the authors outline abstractly a simple, experimental methodology.  They identify this methodology as the truest way to “provide evidence” of the existence and efficacy of learning styles, and use it as a guideline to measure the quality of data in existing literature.  The requirements are listed below:

  1. Learners must be separated into groups reflective of their learning style; the authors suggest “putative visual learners” and “auditory learners” (p. 109).
  2. Within their groups, learners are randomly assigned one of two instructional treatments.
  3. All subjects are assessed using the same instrument.

In order to prove the quantitative efficacy of learning styles, the results of this experiment must show a “crossover interaction”: the most effective instructional method must differ for each group.  The authors note that this interaction is visible regardless of mean ability; even if Group A scores wildly higher on the final assessment than Group B, a crossover interaction can still be observed.
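The crossover criterion can be made concrete with a small sketch. The scores below are invented purely for illustration; the point is only the decision rule: a crossover interaction exists when each group’s best treatment is different, irrespective of which group scores higher overall.

```python
# Illustrative sketch of a "crossover interaction" check.
# Mean scores are invented: rows = learner group, columns = treatment.
means = {
    "visual_learners":   {"visual_instruction": 78, "auditory_instruction": 65},
    "auditory_learners": {"visual_instruction": 60, "auditory_instruction": 72},
}

def best_treatment(group):
    """Return the treatment with the highest mean score for this group."""
    return max(means[group], key=means[group].get)

# Crossover: each group's best treatment differs, regardless of the fact
# that one group may outscore the other on every treatment.
crossover = best_treatment("visual_learners") != best_treatment("auditory_learners")
print(crossover)  # True for these invented numbers
```

Note that if auditory learners had scored 75 under visual instruction, both groups would share the same best treatment and no crossover would exist, even though the groups still differ in mean ability.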

However, it seems that the authors are confounding their hypotheses in much the same way they accuse the literature of doing; assessing the learning styles of a class and identifying which instructional tools will best speak to a particular learning style are completely different processes.  The latter includes interference from several factors, not least the assumption that all instructional methods are equally effective ways to explain the content at hand.  The authors also do not allow for these hypotheses to be proven independently true; by stating that the only acceptable outcome of this experiment is some magnitude of crossover interaction, they ignore confounding factors (the comparative strength of instructional methods relative to each other; whether all learning styles are equally effective ways to explain the content; whether students who identify either an audio or a visual strength will respond to the content in the same way) and assume that either both hypotheses are true or both are false.

But what are the tools for?

In their review of the associated literature, the authors denote only one article that supports the existence of learning styles and uses their outlined experimental method.  They conclude that

“although [this study] is suggestive of an interaction of the type we have been looking for, the study has peculiar features that make us view it as providing only tenuous evidence” (p. 112).

These peculiar features include omitting the mean scores of each group’s final assessment from the paper (instead matching learners with a control); that learner performance was measured by raters; and that the instructional treatments used vary significantly from those “more widely promoted” (p. 112).

This lack of appropriate evidence, the authors conclude, demonstrates that the theory of learning styles is untested at best and nonexistent at worst.  However, the one point the authors decline to discuss is why experimental methodology is best for “proving” this theory in the first place.  They assume that a controlled environment will provide truer or cleaner data without recognizing a singular truth of classroom education: there is no controlled environment.  Educators at the classroom level have no control over the previous education and content exposure of their learners; over the influences learners face outside of school; or over the gender-based, racial or cultural experiences that shape a learner’s perception.  In such an environment, why would it matter to educators that one mode of assessing learning styles, or one instructional intervention, is statistically better than another?  That environment is far removed from the situation this theory is designed to elucidate.

The authors are unresponsive to their own biases, namely in bridging the distance between an idea in theory and in practice.  They claim in their introduction that because learning styles are so untested, meager educational resources should not be focused on studying them or including them in instructional design (p. 105).  However, they fail to consider learning styles on a concrete level.  Is it truly more expensive to personalize a curriculum based on learning styles?  Does learner benefit need to be statistically significant in a controlled environment for it to be “worth” the effort?  Although the authors are in some ways critically reflexive about the unspoken hypotheses researchers assume in discussing learning styles, they are unaware of how their personal biases have shaded their commentary, which raises the question: to whom are the authors speaking?

Sources

Howard, T.C. (2003).  Culturally relevant pedagogy: Ingredients for critical teacher reflections. Theory into Practice, 42(3), 195-202.

Liou, D.D., Antrop-González, R. & Cooper, R. (2009). Unveiling the promise of community cultural wealth to sustaining Latina/o students’ college-going networks. Educational Studies, 45, 534-555.

 

Emotional Intelligence Competencies Can Be Developed

Pool, Lorraine Dacre, and Pamela Qualter. “Improving Emotional Intelligence and Emotional Self-efficacy through a Teaching Intervention for University Students.” Learning and Individual Differences 22.3 (2012): 306-12. Web.

Many researchers argue that emotional intelligence (EI) plays a significant role in our attitudes, health, well-being, and professional success (p. 306). If this is true, why don’t K-12 schools and colleges create and implement curricula that support the development of these skills? “As undergraduate students are gaining qualifications, knowledge and skills to prepare them for future lives in the world of work, it would make sense to ensure they are also equipped with knowledge and skills in relation to emotional functioning and with the confidence to enable them to act on these abilities” (p. 306). This study investigates whether it is possible to improve levels of emotional intelligence and emotional self-efficacy (ESE) in university students through a teaching intervention (p. 307).

Pool and Qualter hypothesize that “both ability EI and ESE appear to be important predictors of academic success and graduate employability; theoretically, it should also be possible to improve them” (p. 307). Therefore, they designed a study to investigate whether or not it’s possible to improve levels of emotional intelligence and emotional self-efficacy. Using university students, the authors studied the impact that an eleven-week intervention class had on the participants’ EI and ESE competency levels.

The organization of the article is clear, coherent and logical. It begins with an introduction broken down into subsections under the following headings: The importance of EI and ESE (laying the foundation for the importance of emotional intelligence and emotional self-efficacy), Designing EI/ESE teaching interventions (describing the interventions and assessments), and The present study (explaining who the participants were). Following the introduction is the methods section, again broken into subsections: Participants, Measures, EI Intervention, and Procedure. Throughout this section, the authors provide detailed descriptions of the study. Next come the findings and discussion sections. In the discussion section, the authors reflect on what they learned, acknowledging that the study had some limitations.

Prior to this study, there was very little research on whether EI and ESE can be improved. One study the authors investigated did not result in any improvements for the participants (that study used a four-week intervention). Pool and Qualter therefore designed their intervention to take place over eleven weeks and found that it resulted in significant participant growth in EI and ESE. The results imply that people need a longer period of learning and reflection in order to develop their emotional understanding abilities. These findings should have significant implications for our K-12 schools and universities and for what we value in our curricula.

It was evident that the authors performed extensive research on EI and ESE and investigated the studies that had already been conducted. In the introduction and throughout the article, they included research for every variable within their study. When designing this particular study, the authors built on the work of Nelis et al. (2009).

They began by designing the intervention for the study (based on the Salovey and Mayer four-branch model of ability EI) and identifying the pre- and post-assessments: the MSCEIT for EI and the Emotional Self-Efficacy Scale (ESES) for ESE. Their study included a larger sample size than the study conducted by Nelis et al., and it included both males and females from diverse academic concentrations. The study also included a control group.

The intervention class was offered to all students as an elective. The class met for two hours per week and was eleven weeks in length. “Students completed the MSCEIT and ESES during the first class and were given a report and detailed one-to-one feedback of their results. They were asked to reflect on their results and incorporate these reflections in their first journal entry. The tests were repeated in the final class” (p. 308).

Throughout the class, the teachers implemented various activities including “mini-lectures, video clips, case studies, group tasks and discussions, role play and an off-campus visit to an art gallery” (p. 308). Students were asked to keep a reflective journal as well as respond to essay prompts and case studies.

The findings were positive. After the eleven weeks, participants showed growth in emotional self-efficacy and in some aspects of emotional intelligence ability. When measured against the control group as well as against their own pre-assessments, the intervention group showed significant improvement.

While presenting the findings, the authors noted several limitations of the study. They stated, “one limitation of this study is the reliance on data gathered from a single source, the participants themselves. The use of multiple source methods, possibly including peer ratings of EI pre and post-intervention, would engender greater confidence in the findings” (p. 311). Another limitation involved the teachers/tutors who taught the intervention: because they play an instrumental role, their own EI and ESE need to be considered when making teacher selections.

“Previous research has suggested that higher levels of ability EI and ESE are desirable for a number of important reasons associated with work-related outcomes, academic achievement and graduate employability, but until now there have been few studies that demonstrate it is possible to increase levels of EI and ESE through teaching or training” (p. 306). Through this study, we can conclude that it is possible to improve EI ability. The results also show that it is possible to increase a person’s emotional self-efficacy. These findings have significant implications for how we should be teaching and training our elementary, middle school, high school and college students.

 

References

Nelis, D., Quoidbach, J., Mikolajczak, M., & Hansenne, M. (2009). Increasing emotional intelligence: (How) is it possible? Personality and Individual Differences, 47, 36–41. doi:10.1016/j.paid.2009.01.046

 

Menu: Accelerated Learning – Best with Sides

Hodara, M., & Jaggars, S. S. (2014). An Examination of the Impact of Accelerating Community College Students’ Progression Through Developmental Education. The Journal of Higher Education, 85(2), 246–276. doi:10.1353/jhe.2014.0006


Most community colleges are feverishly trying to meet President Obama’s College Completion Challenge of increasing the number of students who complete a degree or another credential by 50% by the year 2020.  The task is large: fewer than 30% of community college students graduate within six years, and even fewer of those who test into below-college-level (aka developmental) courses do so (College Completion Challenge Fact Sheet).  Whether you are a neo-liberal who wants students to contribute to the economy or an old-fashioned liberal who wants equality for all people, especially those who have been traditionally under-served, you may find this article examining accelerated learning in English and math classes at the City University of New York (CUNY) community colleges a worthy read.

There are several items on the menu of strategies for helping the under-prepared learner progress toward graduation, e.g. accelerated learning, contextualized learning, and problem-based learning.  This article focuses on accelerated learning, an approach in which the developmental sequence of courses an under-prepared student must take is shortened or sometimes offered concurrently with college-level courses.  The authors examined data from the six CUNY community colleges.  Students in the CUNY system are diverse:  “15% of students are Asian, 29% are black, 37% are Latino, and 19% are White; …48% are first-generation college students; and 46% have household incomes under $20,000” (City University of New York, 2011, in Hodara & Jaggars, 2014).  Overseeing all this diversity is a centralized developmental education testing policy with firm cutoff scores at which students are placed in developmental education classes.  Each of the six colleges, though, was more or less free to design its own “menu,” or developmental course sequence.

The authors found that the English and math departments did not tend to consult with their sister departments in the district, resulting in varying developmental sequences at each college.  Though that seems to me an oversight of administration, it provided the researchers with a ripe opportunity to compare the length of developmental course sequences across the district through analysis of existing data, without having to design an experiment.  In English, the researchers designated as the treatment group the two colleges that had a single course of either six or seven credits, and compared those to the four colleges with two classes in their sequence.  In math, the control group comprised the five colleges that had three developmental math classes, compared to the one treatment-group college that had only two developmental math courses.  Data were made available to the researchers over a 10-year period, which allowed for some longitudinal following of students out of the community colleges into the CUNY universities.

The methodology is where things get complicated (and honestly over my head at this very early point in my doctoral program).  The researchers were concerned that simply comparing outcomes of students in the short (treatment) versus long (control) sequences would not account for confounding variables.  They noticed gender, race, and financial aid differences right away, and they also wanted to account for high school performance and students’ academic and professional goals.  By using a couple of logistic regression models, the researchers were able to compare like students to each other, e.g., students with similar high school records, region of birth, citizenship status, and college major.  This section has much more detail to it, and I encourage readers with the appropriate expertise to explore it further, and those without to trust in the prestige of the Journal of Higher Education and believe the researchers did it right!
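To make the idea concrete for myself, here is a minimal toy sketch of how a logistic regression can estimate a treatment effect (here, an “accelerated” indicator) while controlling for a confounder like prior achievement. Everything below is fabricated for illustration; it is not the authors’ actual models, data, or variables.

```python
# Toy logistic regression via plain gradient descent (no external libraries).
# Fabricated students: (accelerated?, prior_achievement) -> passed college course?
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Fit logistic regression by gradient descent; returns weights, intercept first."""
    n_feat = len(X[0])
    w = [0.0] * (n_feat + 1)
    for _ in range(epochs):
        grad = [0.0] * (n_feat + 1)
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / len(X) for wj, g in zip(w, grad)]
    return w

random.seed(0)
X, y = [], []
for _ in range(400):
    treated = random.random() < 0.5          # randomly "accelerated" or not
    prior = random.gauss(0, 1)               # confounder: prior achievement
    # True (fabricated) model: both treatment and prior achievement raise pass odds.
    p = sigmoid(0.8 * treated + 1.2 * prior - 0.2)
    X.append([1.0 if treated else 0.0, prior])
    y.append(1 if random.random() < p else 0)

w = fit_logistic(X, y)
# A positive treatment coefficient, net of prior achievement, suggests
# the accelerated sequence is associated with higher pass rates.
print("treatment coefficient:", round(w[1], 2))
print("prior-achievement coefficient:", round(w[2], 2))
```

The point of the sketch is only the logic: by putting the confounder in the model alongside the treatment indicator, the treatment coefficient reflects like-for-like comparisons rather than raw group differences.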

In general, students in the accelerated courses had better outcomes in their courses and in subsequent college courses than students in the control groups.  The results were not robust, though, and there was some difference between the English and the math sequences.  Students in the accelerated English sequences were more likely to get to college-level English and to accumulate credits and graduate.  However, students in the math sequence, though they passed college-level math, did not demonstrate long-term college success.  Academic policies within the institutions may contribute to that finding.  The authors report that passing college-level English is required for many other courses allowing those students to continue to make progress toward degree completion while passing college-level math does not necessarily lead to progress in other courses for non-STEM majors.

One aspect of this article’s contribution to the field is its interesting perspective on the role of community colleges in higher education.  The authors suggest that community colleges may actually create barriers to achieving a college education – the opposite of their mission of increasing access to higher education for those who might not have the option to attend university.  Since more first-generation students and students of color start at community college, this may inadvertently create a class system, stratifying middle- and upper-class white students into the universities and students of color and low-income students into the community colleges.

Another contribution is the authors’ acknowledgement in the discussion that the generally modest gains seen for students in the accelerated classes can likely be improved by “more thoughtfully designed reforms incorporating stronger student supports” leading to “substantial increases in developmental students’ college-level credit accrual and graduation rates” (Hodara & Jaggars, 2014).  They also suggest that as colleges continue to work on meeting the completion challenge through improved graduation rates, collaborative conversations around developmental education become more likely, building relationships and infusing diverse perspectives to provide a more nutritious meal that includes healthy sides in addition to the main dish of accelerated learning.

The influence this article has on me and my evolving line of inquiry is that, before taking this class, I was considering pursuing an interest in data analysis – partly because I was getting tired of feeling as though I wasn’t making the difference I had hoped to in the classroom.  Though I believe familiarity with data and careful analysis is crucial to effective teaching and effective programs, I find that the analysis is not quite as tasty to me as the prospect of creating a colorful and nutritious “meal” with a variety of sides that complement the main dish.

 

References

Hodara, M., & Jaggars, S. S. (2014). An Examination of the Impact of Accelerating Community College Students’ Progression Through Developmental Education. The Journal of Higher Education, 85(2), 246–276. doi:10.1353/jhe.2014.0006

American Association of Community Colleges. The College Completion Fact Sheet. Retrieved from http://www.aacc.nche.edu/About/completionchallenge/Documents/Completion_Toolkit_Complete.pdf

 

Cue the Zoom on Baltimore

Weist, M. D., Stiegler, K., Cox, J., Vaughan, C., Lindsey, M., & Ialongo, N. (2010). The Excellence in School Mental Health Initiative: Final Report (pp. 1–41).

 

Last week I looked at School Mental Health procedures in Australia. I was really jazzed to learn that the sort-of unformed ideas floating around in my head for the last year a) were already fleshed out and b) ACTUALLY EXISTED. And they have for over a decade! But I wanted to do more research about how we are addressing School Mental Health a little closer to home. Preliminary research suggested that Maryland is the place to be for this sort of thing, so that’s where I looked. If we were on YouTube, this is where we’d zoom out of Phoenix and pan over to the East Coast. Cue the Zoom on Baltimore:

In this report, completed by the University of Maryland, Center for School Mental Health, the authors reviewed data collected over two and a half years as two specific schools implemented the Excellence in School Mental Health Initiative (ESMHI). According to the report, “The overall goal of the project was to demonstrate the potential for a full continuum of environmental enhancement, stakeholder involvement and evidence-based mental health promotion and intervention integrated into two schools serving students in grades Kindergarten through 8th grade” (Weist et al., 2010).

I realize reading that might invoke a little of what I experienced earlier this week. Are you thinking, “Um, what? I think I know all of those words, but when I read that sentence they just kind of glide right under me…”? Here’s a picture that might help from (of course!) our friends over at MindMatters in Australia:

The Excellence in School Mental Health Initiative involved programs and staff to support all levels of the triangle:

  • They developed and ran parent groups to improve parent involvement and relationships with teachers
  • They implemented Paths to PAX, a universal/school-wide prevention program
  • They provided professional development to teachers to improve understanding of mental and behavioral health
  • They provided small group interventions for students struggling with behaviors, also known as “early intervention strategies”
  • And mental health clinicians held individual/group therapy sessions with students.

Many schools around the country have parts of this triangle in place, but it is rather rare to see the entire continuum fully supported. The purpose of the project was to find out how well all these pieces fit together when they were all in place, including elements like parent involvement. The authors specifically noted that this study could not be used to make causal conclusions (i.e., saying that if your school does these things, it will get the same results), but could be used to make recommendations for the future or for other schools. The authors used descriptive, qualitative, and quantitative data to both create and evaluate the initiative. This included a variety of demographic and school data, such as enrollment, attendance, and discipline; surveys and interviews with students, staff, parents, and community members; treatment data collected by mental health providers; case studies; and focus groups.

The authors found that there were many gains, including better parent-school relationships, students receiving more mental health care, and improved teaching strategies around behavior in the classrooms. They also found that while there are many reasons to provide mental health care within the school setting, there is a lot that can get in the way. Things like high teacher turnover rates, teachers being overwhelmed with everything else they are expected to do, and variable support from administration all affect the implementation and effectiveness of such programs. BUT!!! They did find that having significant supports – like great funding (described below), buy-in from higher levels, and University support – made it possible to face challenges head-on and overcome many barriers.

I was simultaneously disappointed and pleased with the results. This initiative had funding from a lot of different places, including the City of Baltimore, the University of Maryland, the Baltimore School District, and several public and private organizations. It seems like it would be a dream! But even with all that they ran into many of the same problems I have experienced in trying to carry out different layers of the triangle above. For example, I’ve been at a few schools that buy new programs to address the widest level of the triangle (Whole School Environment). Everyone is gung-ho for the first few weeks, but implementation falls off after a month or two. I would have thought that with so many resources and support staff promoting this initiative for 2.5 years, there would have been more buy-in and compliance from staff. But in reality, they dealt with the same problems I have seen, and for the same reasons: too many other things to focus on, too overwhelmed, high teacher mobility, lack of administrative support, and the program not meeting perceived expectations. While this was disappointing, it was also refreshing to know that simply implementing a new program or jumping on the next bandwagon of a particular intervention is not going to change the school culture. To really make a lasting impact on the school culture, it needs to happen slowly and over a long time.

I did have more criticisms of this report than I did last week’s. For the most part, the report was well-written and easy to follow. They used language that was easy for someone in the educational field to understand, and they gave so much data to support their conclusions. The difficulty was that there was So. Much. Data! And it was all in paragraph form, which meant it was nearly impossible to really get a good handle on it. They were using data from 2 schools and gave in-depth analysis of each type of data from each school. If it had been presented with visuals, like graphs and charts, it would have been so much easier to grasp. Throughout the report they did reference graphs and charts in the appendices… but there weren’t actually any appendices at the end. And the Appendices link provided in the Table of Contents was no longer active. I think I would have been better able to make connections to my own school if I could have seen the data differently.

As I am collecting these articles and reports, I am building this dream world in my head. I want these things in Phoenix, in Arizona. I want to be a part of building them, of making them actually happen. I want to see students in their classrooms more because they’re getting in trouble less. I want to see students that have a better quality of life because they understand what to do in the classroom or in social situations and they have the skills to do it. I want to see teachers who are less depressed and stressed out. I want to be in classrooms where teachers are able to focus on the things that made them want to be a teacher, not all the extraneous junk that keeps getting piled on their plates. (OK, mental health initiatives probably won’t actually affect that, but hey – it’s my dream world, I can make it look however I want!)

I really do want to see some of these initiatives in play, though, to see what they look like when they’re actually happening. Does it look, feel, and sound like any other school? Are culture changes only noticeable if you’re an insider, privy to all the inner-workings of a school? Or is it tangible? Noticeable to everyone who walks in? Do students and teachers enjoy being there because of the positive atmosphere? Or is it still a school, where kids complain about homework and teachers count down the days to summer break? I don’t have the answer yet, but I am doing what I can to find out!


Facebook as Professional Development?

Rutherford, C. (2010). Facebook as a source of informal teacher professional development. In Education. Retrieved from http://ineducation.ca/index.php/ineducation/article/view/76/512.

 

For professional development (PD) to be considered effective, it must meet four criteria. These criteria characterize PD as: 1) Sustained, on-going, and intensive; 2) Practical and directly related to local classroom practice and student learning; 3) Collaborative and involving the sharing of knowledge, and; 4) Participant driven and constructivist in nature (Rutherford, 2010, p.62). In the journal article, Facebook as a Source of Informal Teacher Professional Development, author Camille Rutherford seeks to ascertain whether discussions that happen between teachers and other educational professionals on social media can be considered professional development and if such informal conversations meet the above four criteria for effective PD.

Rutherford (2010) begins her article by giving historical context on the seven different categories that form the knowledge base for teaching; such categorization serves to, “simplify the otherwise outrageously complex activity of teaching” (Rutherford, 2010, p. 61). These seven categories are not meant to be taken as a reduction of the teaching profession to a list of criteria, but rather form contextual categories that help synthesize the diverse areas in which professional development can be offered. These seven categories, as first defined by Shulman (1987), are: 1) content knowledge; 2) general pedagogical knowledge; 3) curriculum knowledge; 4) pedagogical content knowledge; 5) knowledge of learners and their characteristics; 6) knowledge of educational contexts [e.g. different styles of education]; and 7) knowledge of educational ends, purposes, and values [e.g. historical perspectives] (Shulman, 1987, p. 7).

In order to determine whether teachers’ conversations on social media met the criteria to be considered effective PD, Rutherford monitored the postings on a Facebook group for teachers in Ontario, Canada. She cites Facebook’s perception as an “adolescent playground ripe with juvenile gossip and social bullying”; despite this perception, however, she notes that Facebook has become a gathering space for professionals seeking opportunities to network and exchange ideas and resources (Rutherford, 2010).  In her monitoring of the Ontario Teachers – Resource and Idea Sharing group, which, at the time (2010), had more than 8,000 members, she used both qualitative and quantitative examinations of the discussion topics.

Over the course of the 2007-08 school year, she found that 278 new and unique topics of discussion were created, generating 1,867 posts from 384 different Facebook users (Rutherford, 2010). Any post that didn’t garner more than two responses was excluded from the study, as, without another’s input, it cannot be considered a discussion. Any post deemed too sales-y, or geared toward promoting an item, product, or service for a fee, was also excluded. Two independent “coders” then went through the remaining posts and categorized each into one (or more) of the seven different categories of teacher knowledge described above (Rutherford, 2010).
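The filtering-and-tallying procedure is simple enough to sketch. Here is a toy version in code (my own made-up representation of a thread, not Rutherford’s actual data or tooling), applying the two exclusion rules and then counting how often each knowledge category was coded:

```python
# Toy sketch of the study's filtering rules: drop threads with two or fewer
# responses, drop promotional posts, then tally the coded categories.
from collections import Counter

threads = [  # fabricated example threads
    {"responses": 5, "promotional": False, "codes": ["pedagogical content"]},
    {"responses": 1, "promotional": False, "codes": ["curriculum"]},   # too few replies
    {"responses": 8, "promotional": True,  "codes": ["employment"]},   # sales pitch
    {"responses": 4, "promotional": False, "codes": ["curriculum", "employment"]},
]

# Keep only real discussions (more than two responses) that aren't ads.
kept = [t for t in threads if t["responses"] > 2 and not t["promotional"]]

# A post may carry more than one code, so tally every code on every kept thread.
tally = Counter(code for t in kept for code in t["codes"])
print(tally)
```

From a tally like this, category shares (Pedagogical Content Knowledge at just over a quarter of posts, Employment at 22.5%, and so on) fall out as simple percentages of the total.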

The study found that the majority of the posts were related to Pedagogical Content Knowledge (strategies, tips, and tricks to help out in the classroom), representing just over a quarter of all posts (Rutherford, 2010). The next category was a surprising one: it didn’t fit into any of the categories in Shulman’s conceptual framework for teacher knowledge, so Rutherford created a new category, Employment (opportunities and/or related questions), which made up 22.5% of all posts analyzed (Rutherford, 2010). The third-largest category, Curriculum Knowledge, accounted for 19.8% of posts; all other categories each comprised less than 10% of the total (Rutherford, 2010).

One of the essential features of effective professional development is that it be collaborative, on-going, practical, and participant driven. Rutherford (2010) found that the average number of months that users were actively engaged in discussion was less than 2 months (1.79 months) and the average user made only 4.2 posts during that span. These data suggest that discussions happening on Facebook, while certainly constructivist, collaborative, and participant-driven in nature, were lacking the essential “on-going” feature necessary for effective professional development.

In my situational context, as a professional development provider to schools across the state, we’ve tried to integrate more online components into our portfolio of professional development offerings, only to find that teachers generally have not utilized them to the extent we were hoping. I see this evidenced in my own practice as well. When I reflect on my own professional development, both as a teacher and in my current role as a trainer, I’ve been asked to “continue the conversation” on Edmodo, a social media site similar in platform to Facebook, but dedicated to educators and education. I found the steps of creating a username and password, confirming my email, setting up a profile, requesting access to the page, and waiting to be granted access to be cumbersome hurdles that did little to streamline the continuation of learning. While writing this blog post, I went back to those pages, only to find that there had been just one post in the eight months the group had been around.

In my own learning experiences, like my Master’s degree, for example, I found the process of online modules, classes, and activities to be an ineffective medium to facilitate true learning, as the “flow” of a conversation was very unnatural and not conducive to insightful reflections and discussions on practice and pedagogy. While I’m sure that some people may enjoy and find value in the convenience of the online style to meet their varying schedules and time constraints, there is, for me, something incredibly valuable about in-person, face-to-face interactions when learning from and with other people. It becomes much easier, in person, to hear the other person’s tone, read their body language, and ask follow-up questions in a meaningful and timely manner – things that are lost through virtual communication. Because of these sentiments, I generally agree with Rutherford’s assessment when she says, “Facebook teacher groups and similar forms of social media should be seen as an effective supplement [emphasis added] to traditional teacher professional development” (Rutherford, 2010, p. 69). The idea that online modules could ever replace in-person professional development is not one I could support, but social media certainly has a role to play as a free, low-risk, and convenient medium for teachers to collaborate and learn from one another.

 

Additional works cited:

Shulman, L. S. (1987). Knowledge and teaching: foundations of the new reform. Harvard Educational Review, 57(1), 1-22

Music and Technology

Carruthers, G. (2009). Engaging music and media: Technology as a universal language. Research & Issues in Music Education, 7(1), 1–9. Retrieved from http://www.stthomas.edu/rimeonline/vol7/carruthers.htm

 

This week I read “Engaging Music and Media: Technology as a Universal Language” (Carruthers, 2009). The article is about the role of music and technology in education and how they might play a role together. The article doesn’t offer new research, but it does synthesize others’ research.

The first discussion is about the roles of music within education and how the two might affect each other. Carruthers states that music often plays a secondary role in education; that is, we don’t teach music as part of our curriculum because music is good in and of itself, we have music within our curriculum because it supports something else. As a music teacher, I often find myself saying “This directly supports you” to other content teachers. You don’t often hear a math teacher justifying why the kids need to learn math. There is an array of reasons why music is valuable on its own. It doesn’t need to be supporting anything else.

After reading the article, I recognized that I had used the same type of reasoning as the supporters of Flores v. Arizona, as discussed in “Keeping Up the Good Fight: The Said and Unsaid in Flores v. Arizona.” The supporters had many reasons why the ELL funding in Arizona should be awarded to the schools; the findings, however, showed that the reasons from the supporting side fell under a ‘you should support this because you’ll get this out of it’ mentality (Thomas, Risri Aletheiani, Carlson, & Ewbank, 2014). With that being said, great teachers integrate all areas into their content. Students need to see how everything is interrelated. Often children are taught in compartments: math in math class, science in science class, etc., but our lives do not work this way.

Music has what Carruthers calls a division of labor: the composer, the performer, and the listener each have a separate job, and people rarely cross over. With the addition of technology, this isn’t necessarily the case. My own children compose music with special applications that do not require them to read music; anyone with the right software can do all three. I see this as one of the biggest impacts technology has had on music. In the past, if one didn’t read music, composing to share with others was rather difficult. Now, with software and media sharing, it becomes relatively easy.

In order to look at the various ways technology impacts us, Carruthers defines technology as anything “from the wheel” to “a personal computer.” This immediately caught me off guard. Defining what technology is had never occurred to me. I simply thought of technology as laptops, computers, and electronic devices, along with any software to go with them, but after reading how Carruthers approaches technology, I may have to be more specific about what I’m viewing as technology within my research. The ways technology can have an impact, according to Carruthers, can be broken into four parts: technology that 1) makes things easier to do than before, 2) does things better than before, 3) allows us to do things we couldn’t do before, and 4) makes us think differently. Again, I had to consider the future of my research: at what level of impact am I going to be assessing? For instance, making things easier to do than before, such as multiplication practice, may not have as big an impact on student achievement as something that makes the student think in a different way.

The article was more thought provoking than I expected it to be.  Carruthers was clear from the beginning that he was reviewing previous research and that the paper would not answer many of the questions. The purpose of the paper is to create discussion, and it proved to do just that. It caused me to look at the research I’m heading into and the basics of how I will approach it. I am dealing with so many more layers than I had previously thought. Carruthers poses, “It is incumbent upon us as educators not only to evaluate the uses of technology – to extol its virtues and denounce its failings – but also to explore deeply how it encourages or causes us to think differently about the world around us.” In my research, I will have to decide whether to look at the level of technology that creates the deepest learning or not take it into consideration at all.  Do I continue looking at the impact of music with technology on achievement, or solely at the impact of technology? If I research the impact of music and technology together, does the depth of learning within the music matter in the research? For instance, composing requires a deeper depth of knowledge than identifying notes. How does one take this into consideration?  If my research does show an impact on student achievement, is it necessary or valuable to determine whether the act of utilizing technology is creating more engagement or whether the technology is deepening the students’ understanding? Either one could affect student achievement; is there a way to tell which it is? How do I approach the research in a manner that will include my community and their views? In fact, can I even account for the effects that technology, and especially music, has on the community?

Carruthers said it well: “Many of the benefits of music study, some of which are imbedded in the art form itself, are intended by teachers and curriculum planners while others are not.” I suspect that this is the case with technology as well. Unfortunately, it adds another question for me: how do I consider this in my research?

Overall, the article was well written and professional. It was organized in a logical way, and he was very clear that he was presenting theories and that, as a literature review, the paper would create more questions than it could answer in one piece. His ideas are insightful and have definitely given me pause. I have a lot to consider as I dive deeper into my research.

 

 

Thomas, M. H., Risri Aletheiani, D., Carlson, D. L., & Ewbank, A. D. (2014). “Keeping up the good fight”: the said and unsaid in Flores v. Arizona. Policy Futures in Education, 12(2), 242–261. doi:10.2304/pfie.2014.12.2.242

Education Starts in the Community

Grothaus, T., & Cole, R. (2010). Meeting the challenges together: School counselors collaborating with students and families with low income. Old Dominion University.

In the article, Meeting the Challenges Together: School Counselors Collaborating with Students and Families with Low Income, Tim Grothaus and Rebekah Cole examine how school counselors work with members of the community to raise student achievement and provide students more opportunity and access to resources. In addition, school counselors are exploring ways of informing community members. Grothaus and Cole’s purpose for their study is given right up front: “culturally responsive school counselor advocacy and collaboration with low income students and their families is essential to successfully address the pernicious achievement and access gaps pervasive in U.S. schools” (Grothaus & Cole, 2010, p. 3). This article was very well organized.  Ideas were clustered and headings were used throughout to make the argument very easy to follow.  I chose this article for its insights that parallel my topic of inquiry, of “school counselor roles in challenging biases, educating stakeholders, and engaging in advocacy for these students and families” (Grothaus & Cole, 2010, p. 12). The theoretical framework for the research is set around the premise that “the prevalence of youth from families with low income and the distressing inequities in the educational data associated with family income level merits attention” (Grothaus & Cole, 2010, p. 3).

I found the research vague about how data were collected. Much of the data used was referenced through other sources, and it seemed that relevant data were left out that would, in my view, support their analysis. For example, how many of the youth are first-generation immigrants, how new are they to the school, and how often do they move? Answers to these questions would help transition the reader to broad findings such as, “Data indicate that low income students have not been afforded equitable educational experiences” (Grothaus & Cole, 2010, p. 3). Is this data speaking for all low-income students, and what if one compares data for equitable educational experience by geographic location? For example, a student who lives near Washington, D.C. will have better opportunities and exposure for learning about government than a student from Texas or Arizona. Likewise, students in Arizona and Texas will have more opportunities around topics such as SB 1070 and border security. Are those geographic experience factors counted as equitable educational experiences? It seems difficult to accurately support the claim that “students from families with low income often lack the resources and teacher expertise of more affluent schools” (Grothaus & Cole, 2010, p. 3). Is teacher expertise at these low-income schools measured by student achievement? Teacher evaluation is a point of controversy that is still being debated. In the interest of duplicating the findings, I would like to see the article define things like educational experiences and teacher expertise. One example could be for researchers to pull teachers from an affluent locale and a low-income locale, let them trade places for a period of four to five years, and then return with data showing the disparity in teacher expertise. The research showing that “low-income families tend to be less involved in their children’s academic lives than middle-class families” (Grothaus & Cole, 2010, p. 4) is more convincing data for the differences in student achievement: “Students who are eligible for free lunches are about two years of learning behind the average student of the same age from non-eligible families” (Grothaus & Cole, 2010, p. 4). What troubles me is the focus on what I see as supporting details to the core of a bigger problem.

I can’t see how to separate the school from the problems of the rest of the community. The authors offer all of these facts, which basically amount to a list of citations strung together, and then turn the corner with an example of a school that is beating the odds. Indeed, many schools are “proving that race and poverty are not destiny” (Grothaus & Cole, 2010, p. 6), and “one seemingly robust factor in many of the success stories is school and family collaboration” (Grothaus & Cole, 2010, p. 6). This is the data that is of the most interest to my own topic of inquiry: “Research over the last few decades confirms that family involvement in their children’s education enhances the potential for students’ success- specifically with high achievement, increased rates of attendance, fewer disciplinary referrals, better homework completion, more course credits accumulated, and increased likelihood of high school graduation and college attendance” (Grothaus & Cole, 2010, p. 6).

The larger community is a combination of business, government, civic services (like police, fire, power, water, and sewer), and schools. All of these elements of the larger community move in and are created to serve the people of the community. Conversely, the larger community depends on the people of the community to buy and spend their money for its support. It’s simple economics. Low income is linked to people with little education working in low-paying jobs, which translates to longer hours and results in the larger community receiving less money. The problem in low-income communities is that there is less money. The solution is to increase the earning potential of the people, and the implied and obvious way to quickly increase earning potential is through education. Therefore, it benefits the community and its people to invest in ways of supporting education.

“A number of schools and their boards are arriving at the same conclusion- that collaboration is an avenue through which students’ needs may be met and achievement promoted” (Hands, 2005, p. 64). The question that requires further study is how the community and the people in it can support education. A good starting point is “identifying goals, defining the focus of the partnerships, and selecting potential community partners” (Hands, 2005, p. 67). Grothaus and Cole point out that school and family collaboration also needs to “examine school personnel biases about families with low income and challenging colleagues to change their views and practices” (2010, p. 7). I don’t think focusing on teachers’ biases effectively contributes to establishing collaboration: “Unnecessarily alienating school personnel through strident advocacy may be less effective than respectfully but firmly challenging biases and building coalitions for change based on shared principles” (Grothaus & Cole, 2010, p. 8). “School-family partnerships benefit schools and families in a variety of ways, including families’ feelings of acceptance into the school community” (Grothaus & Cole, 2010, p. 10). Collaboration will build up channels of communication to ensure the school and the community are “empowered and equipped with the resources they need to support their children” (Grothaus & Cole, 2010, p. 10). As Grothaus and Cole conclude, “School counselors can advocate for these partnerships via challenging bias, training school personnel, engaging in outreach to families, conducting research to ascertain effective practices, and promoting the benefits involved in collaborative problem solving and accessing student and family strengths” (Grothaus & Cole, 2010, p. 13). The solution should start with the community, which then partners with the schools in the process.

References:

Hands, C. (2005). It’s who you know and what you know: Process of creating partnerships between schools and communities, 63–84.

Barriers to Introducing System Dynamics in K-12 STEM Curriculum

Skaza, H., Crippen, K. J., & Carroll, K. R. (2013). Teachers’ barriers to introducing system dynamics in K-12 STEM curriculum. System Dynamics Review, 29(3), 157-169.

Science, technology, engineering, and math (STEM) education is required to prepare students for fast-paced 21st-century careers, but best STEM teaching practices have yet to be fully developed. One technique currently being studied is system dynamics modeling, which “provides a valuable means for helping students think about complex problems” (Skaza, Crippen, & Carroll, 2013, p. 158). System dynamics offers a means of thinking and modeling that allows students to begin making connections between variables. If system dynamics modeling gives students greater access to STEM curriculum, I believe we need to discover the barriers to program implementation and actively begin breaking them down.

Skaza, Crippen, and Carroll analyzed current barriers to introducing system dynamics into K-12 STEM curriculum in their 2013 article. The authors address three research questions by means of a mixed-method approach. The questions are as follows:

  1. How are teachers currently using system dynamics simulations and stock and flow models that were already a part of their adopted curriculum?
  2. For teachers who are not using the simulations, what barriers persist to their classroom implementation?
  3. What is the level of teachers’ understanding of the system dynamics stock and flow modeling language and how might that be influencing the classroom use of system dynamics tools? (p. 158)

The organization of the article is clear, allowing the reader to easily progress through the study of ‘system dynamics.’ Structurally, the article begins with an introduction, which includes the main research questions addressed in the remaining sections. After the introduction there is a review of related literature, allowing the reader to get a better view of previous findings by other scholars. The literature review contains relevant topics that allow for a broader examination of the research topic. Next, the authors thoroughly cover the context for the investigation, the methods used, the results, a discussion section, and final remarks on future research. As a whole, the organization of the article is comprehensive and coherent.

Skaza et al. (2013) addressed a concept that has previously been studied by other educational researchers. According to the authors, a “larger base of empirical research is needed” (p. 159) on system dynamics before it can be fully utilized in most K-12 classrooms. Overall, the study found that only 2.8% of the educators completed the curriculum, which is equivalent to two participants. After this discovery, the researchers analyzed the major barriers, such as lack of access to technology, low teacher efficacy, and insufficient professional development support. The study’s outcomes will allow future research to address the major barriers discussed.
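As a quick back-of-the-envelope check on those numbers (my own arithmetic, not from the article): if two teachers constitute 2.8% of a group, that group must contain roughly 71 people, whereas 2.8% of the full 160-teacher sample would be about 4.5 people. The percentage therefore appears to be computed over the pool of survey respondents rather than everyone sampled.

```python
# Back-of-the-envelope check of the reported completion figure
# (my own arithmetic; the article reports 2.8% and two participants).
completers = 2
completion_rate = 0.028

# The group size implied by "2 people = 2.8%".
implied_group = round(completers / completion_rate)   # about 71 respondents

# For contrast, 2.8% of all 160 surveyed teachers.
share_of_full_sample = completion_rate * 160          # about 4.5, not 2

print(implied_group, share_of_full_sample)
```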

Within the article, Skaza et al. (2013) analyze systems thinking and system dynamics modeling as means of improving access to STEM curriculum, particularly for minority students. Systems thinking and system dynamics modeling “is consistent with recent calls for educational reform that focuses on active learning strategies, teaching for transfer to new problems, as well as intending for creativity and innovation as key outcomes” (Skaza et al., 2013, p. 157). Thus, the study is relevant to the United States’ broader push toward effective STEM education.

In regard to theoretical frameworks, “the theoretical framework for this revision includes system and system models as crosscutting concepts and as a component of Scientific and Engineering Practice” (Skaza et al., 2013, p. 157). As a whole, the authors stay true to the framework, making the article cohesive and appropriate.

Within the methods section, the authors discuss the mixed-method approach to data collection used in the quest to answer the three research questions. The “research method involved a single-group, mixed-method (quantitative-qualitative) design consisting of two phases: a survey followed by a focus group” (Skaza et al., 2013, p. 160). Participants were selected from 40 high schools and consisted of 160 teachers, while the focus group was made up of four participants. The survey consisted of 17 questions containing both qualitative and quantitative measures. The focus group contributed valuable support for the survey findings, support that could be made stronger by increasing the number of focus group participants.

The researchers analyzed the surveys by looking at both qualitative and quantitative data, while using the focus group information to add depth to the survey findings. If another researcher wanted to replicate the analysis piece of this research, there is adequate information to do so. The analysis section fully describes the steps taken by the researchers and allows for replication due to the specifics of how data was analyzed in both the surveys and the focus group. Overall, the researchers determined the number of participants who actually implemented the system dynamics concept in their classrooms; when teachers failed to implement it, the researchers worked to uncover the barriers to implementation.

As far as the findings are concerned, they are based on a thorough understanding of the data. By this I mean that the researchers analyzed the survey information, drew conclusions, and then used the focus group to either confirm or disconfirm those conclusions. Also, there were multiple questions within each category on the survey, helping the researchers gain more accurate information. For example, the survey asked teachers to demonstrate their understanding of the concepts through essay answers. So, if a teacher said that unavailable technology was their barrier yet was unable to describe a science concept, the researchers could conclude that teacher efficacy was also an issue. The researchers found that the major reported barrier to using system modeling in the classroom was technology, yet the focus group and survey essay answers told a different story of potential teacher-efficacy problems. Thus, I believe that the barriers are accurately captured, which can in turn lead to potential new research or action.

As an educator, I have experienced the push toward technology use in classrooms. I believe that this push is necessary and important for the growth of our students and the need to bring them into the 21st century. Our goal is to help students use technology to problem solve and work toward higher understandings, but what happens when teachers don’t fully understand how to integrate technology into the classroom? Many educators I have encountered feel uneasy about technology and thus do not make an effort to use it to enhance the learning environment. That said, our first move toward incorporating system dynamics modeling into the classroom, in order to enhance STEM understandings, is ensuring that all of our educators and future educators are technologically competent.


References

Skaza, H., Crippen, K. J., & Carroll, K. R. (2013). Teachers’ barriers to introducing system dynamics in K-12 STEM curriculum. System Dynamics Review, 29(3), 157-169.

Promoting success in online education… but, what is success?

Harrell, I. L., II. (2008). Increasing the success of online students. Inquiry, 13(1), 36–44. Retrieved from http://www.vccaedu.org/inquiry/inquiry-spring-2008/1-13-Harrell.html

A concise, if relatively simplistic, piece, “Increasing the success of online students” highlights three components that impact student retention in online or distance education programs (Harrell, 2008): student readiness, orientation, and support. Harrell notes that online and distance education research also demonstrates the importance of “instructor preparation and support” and “course structure” for online student success, but the author sets these aside for this discussion. In part because online education programs suffer from very high attrition rates, the author focuses on retention as the primary indicator of online student success.


Whereas other studies of online learner success, particularly prior to the extensive penetration of the internet in the distance education domain (Roblyer, Davis, Mills, Marshall, & Pape, 2008), focus on either learner characteristics or the learning environment, Harrell does not make this distinction. Corroborating this approach, through an extensive research effort culminating in a readiness instrument for [prospective] online learners (the Educational Success Prediction Instrument [V2]), Roblyer et al. (2008) state that their “findings indicate that a combination of student factors and learning conditions can predict success” of online learners, “though predicting success is much easier than predicting failure” (p. 99). The orientation of the piece is higher education – the author is an assistant professor and the coordinator for student affairs at J. Sargeant Reynolds Community College, presumably writing from his own context; however, the references used and the message are more broadly applicable. While Harrell’s piece is not revelatory, it reinforces certain best practices, espoused by related studies, relevant for online learning program development.


“Positive impact on online student success”

When an individual embarks on anything new, preparation for the new environment, expectations, relationships, and required skills is integral to his or her capacity to endure what’s ahead positively and productively. Harrell recommends assessing student readiness for online learning before a student begins coursework, then using this information either to counsel students against the online option or to build an individualized support strategy for each student, based upon their apparent strengths and weaknesses. An orientation should follow, possibly in the form of an entire course (as exemplified by Beyrer (2010) and the Online Student Success online education orientation course). The author favors online (vs. face-to-face) orientations, to get students navigating the technologies and program expectations immediately, in ways that “mimic” their educational program, before coursework becomes distracted by the student’s [inevitable] technical struggles. Student technical support that is as accessible and available as the “anytime, anywhere” coursework is absolutely necessary. The useful suggestion is made to leverage the skills of student workers and others within and beyond the school community to optimize support in this way (without requiring financial and human resources to which many schools lack access).


Enabling students to feel and cultivate their own sense of community and belonging is critically important – to students’ individual achievement and to the success of the program. The author cites studies that have recorded students’ reasons for withdrawal as very often being a sense of isolation, or not feeling a part of something (bigger than themselves). A community among online students is relevant for facilitating a peer culture with mutual engagement, contributing to the student’s school support system, and creating opportunities for interdisciplinary collaboration and shared “real world” experiences. Tools for communicating regularly and without pretense, e.g. instant messaging and social networking, and online spaces, e.g. “virtual lounges,” where students can connect on academic or personal topics can support the development of communities. “The more students integrate into the formal and informal social and academic culture of the institution, the more successful they will be” (Harrell, 2008). In addition to these features of a success-supporting online program that Harrell focuses on, Roblyer et al. (2008) emphasize that “initial active involvement in online courses predicts success. That is, students who are active in the first few weeks of the class are more likely to be successful in the course; dropout behavior is most likely to occur in the early weeks of the course” (p. 106).


The development of a “sense of community” is different from developing a community of practice.  “Communities of practice [as defined by Etienne Wenger-Trayner] are groups of people who share a concern or a passion for something they do and learn how to do it better as they interact regularly” (http://wenger-trayner.com/theory/). Perhaps the more inclusive (for both the participants and the institution) and ultimately impactful approach is to develop a community of practice among online learners.


Peers – in multiage groups spanning grade levels – might organize an action research agenda around a theme or specific research question, as one example of constructing empowered communities of practice among online student populations. They could do this on a semester, annual, or episodic basis, but continuing throughout their postsecondary career. Each student would have a position in the community, defined in part by their experience and budding expertise (or competences, as Wenger [2000] discusses this). Through the shared research agenda, with each individual engaged in and accountable for some aspect of the process, as well as coordinated action steps to maintain the group’s “alignment” to the co-constructed vision and mission, the students would gain invaluable experience navigating the worlds in their research purview, collaborating with each other, and working toward a common purpose (Bautista, Bertrand, Morrell, Scorza, & Matthews, 2013; Wenger, 2000). The community of practice would serve students’ development in ways applicable to and transcending academia – arguably supporting their “success”. Moreover, the likelihood of their retention would be significantly improved.


“Success” = Retention?

Harrell uses “success” and “retention” nearly interchangeably. Is student success no more than an enrollment number? Many days, given considerable budget constraints and the overly convoluted ADM calculation process (average daily membership [ADM], which determines the per-pupil compensation charter schools receive) for online schools in the state of Arizona, retention feels so crucial to institutional “success” (read: viability and sustainability) that it doesn’t seem a stretch to conceptualize student success in the stark terms of attendance vs. withdrawal. However, the effort and heart involved in establishing a new school are likely not just for the warm bodies and smiling faces (hidden behind various screens). The purpose is more plausibly to provide a better, alternative, or altogether unique educational opportunity to some subset of students. Defining success in this narrow way unquestionably narrows the exploratory purview: if the investigator is interested only in conditions and learner characteristics that lend themselves to a student’s staying in or leaving a school, will the data capture include relevant life circumstances (e.g. having a baby, needing to care for an ailing family member, having to prioritize income generation, or the onset of a mental disability)? In other words, will this highly limited conceptualization of success skew the perspective on online educational program quality?


On a personal note, I had a meeting this week with a student who “dropped out” of our brick-and-mortar school in her eleventh grade year, due to a sudden emergence of debilitating expressions of a mental condition.  This would be a “failure” – on the student’s part and on our part – with respect to Harrell’s use of “success”.  However, she returned.  Several months later, she feels, once again, capable of course work.  Success!  (For now.)  A more comprehensive investigation would seek an understanding of: what kept the family connected to our school; why they felt they could trust us during her leave and now upon her return to care for her appropriately; and, what sorts of support they have received from us that kept their family loyal.

Roblyer et al. (2008) suggest that “virtual schools … must come to gauge their success not only in terms of numbers of students served and courses offered but also in terms of how much they provide access and support to students most in need of an educational edge” (p. 107). The intent of this post is not to interrogate the author’s use of “success,” but perhaps that inquiry will emerge in the future. What is most interesting about this examination is what it signifies for program development: the benchmarks for programmatic evaluation and the metrics of success are, by necessity, predicated upon the institutional imagining of Success – at the student level and at the organizational level. When we speak of “excellence” in our contexts and consider an action research program to improve upon some aspect of it or, more generally, to strive toward excellence, it is unlikely that retention emerges as the lead indicator.


Bautista, M. A., Bertrand, M., Morrell, E., Scorza, D., & Matthews, C. (2013). Participatory action research and city youth: Methodological insights from the Council of Youth Research. Teachers College Record. Retrieved May 30, 2014, from http://www.tcrecord.org.ezproxy1.lib.asu.edu/library/content.asp?contentid=17142

Beyrer, G. M. D. (2010). Online student success: Making a difference. MERLOT Journal of Online Learning and Teaching, 6(1). Retrieved from http://jolt.merlot.org/vol6no1/beyrer_0310.htm

Roblyer, M. D., Davis, L., Mills, S. C., Marshall, J., & Pape, L. (2008). Toward practical procedures for predicting and promoting success in virtual school students. American Journal of Distance Education, 22(2), 90–109. doi:10.1080/08923640802039040

Wenger, E. (2000). Communities of practice and social learning systems. Organization, 7(2), 225–246. doi:10.1177/135050840072002

Wenger-Trayner, E. (n.d.). Communities of practice: a brief introduction. Retrieved June 05, 2014, from http://wenger-trayner.com/theory/

PAR by Proxy: Participatory Action Research, Emphasis on the “Research” (Not So Much on the “Participatory” or the “Action”)

Ostmeyer, K., & Scarpa, A. (2012). Examining school-based social skills program needs and barriers for students with high-functioning autism spectrum disorders using participatory action research. Psychology in the Schools, 49(10), 932–941. doi:10.1002/pits.21646

Ever since learning about participatory action research (PAR), and particularly the work of the Council of Youth Research at UCLA (check out their work here), I’ve been obsessed with the idea. The term itself–participatory action research–doesn’t sound quite as exciting, as novel, as potentially revolutionary, as it is. For the uninitiated, the goal of PAR is to “develop interventions with the direct input of stakeholders” (Ostmeyer and Scarpa, 2012, p. 932). Although PAR-based studies often are designed according to traditional Western evidence-based, if-it-can’t-be-measured-it-doesn’t-exist values, the goals of PAR can intersect with the goals of indigenous research methodologies, in that the PAR studies “take into account the ideas and perceptions of the population directly affected by the problem” (Ostmeyer and Scarpa, 2012, p. 932), just as advocates for indigenous research methods embrace a form of research where “what is acceptable and not acceptable research is determined and defined from within the community” (Denzin and Lincoln, 2008, p. 6) and “they [indigenous people], not Western scholars, have first access to research findings and control over distribution of knowledge” (Denzin and Lincoln, 2008, p. 6). The Council of Youth Research, for example, empowers “youth of color attending city high schools [to] become lead agents in the process of research” (Bautista, Bertrand, Morrell, Scorza, & Matthews, 2013, p. 2). In this model, the people with the most at stake in these critical questions are actively involved in the process of research–and they are the immediate beneficiaries of its findings.

Man. This is exciting stuff. The thrill of it can’t be stated more succinctly than it was by Dr. Melanie Bertrand, who worked with the Council of Youth Research at UCLA and has written and published about its work, upon her visit to our class last night: “The most amazing thing about [it] is that the students who are most marginalized … have the most wisdom to share” (M. Bertrand, personal communication, June 5, 2014).

If I could include sound effects in this blog, right about now I’d embed that old record-scratch sound effect. Wait. Stop. This is exciting stuff, but think about the world I live and work in. The students I teach at my exclusive, prestigious, expensive suburban private school are not the underserved, marginalized urban youth involved with the Council of Youth Research. What place could PAR or, thinking more broadly, indigenous research methodologies have in my world? But then I come back to Dr. Bertrand’s comment: “the students who are most marginalized have the most wisdom to share.” “Most” is relative. No community is without its center and its fringe, its powerful and its less-so.

Although not an indigenous people by the textbook geographic, ethnic, or historical definition, disabled people have been turned “into an essentialized ‘other’ who is spoken for” (Bautista et al., 2013, p. 5). Therefore, indigenous research methodologies may be a viable way to locate disabled people themselves “at the location where research is conducted and discussions are held [to] serve as a major link between fully understanding the historical vestiges of discrimination and the present day manifestation of that discrimination” (Parker, cited in Dunbar, 2008, p. 98). It was not so long ago, after all, that Josef Mengele performed his sadistic “experiments” on people with dwarfism (just one example of a heartbreaking many) or, right here in the United States, “researchers injected cancer cells into 19 old and debilitated patients” (Stobbe, 2011) at a New York hospital, without the informed consent of the patients themselves, to see if their bodies would reject them. And if anyone wants to argue that, in addition to being a post-racial America, we’ve moved past discriminating against people with chronic illness or disability, I’ll point out that “less than one-half of individuals aged 21 to 64 with a disability” (Brault, 2012, p. 10) are employed, and those who are working earn less than those without disabilities (Brault, 2012, p. 12). As many people have pointed out before, the depictions of disabled and chronically ill people in the media continue to come from a narrow menu of reductive options: the scary, disfigured villain; the noble, sexless saint; the angry, vengeful victim. But that’s a blog post for another day. What about participatory action research, this private school of mine, and students with disabilities?

Cue the record scratch again: There isn’t a huge population of (physically) disabled students at my school. That’s always struck me as weird. After all, according to the U.S. Census Bureau, 8.4 percent of Americans under the age of 15 have a disability, and 10.2 percent of Americans between the ages of 15 and 24 do (Brault, 2012, p. 5). And yet, a casual conversation with a couple of colleagues who have worked at my school for about 75 years between the two of them yielded only a handful of names of kids with disabilities. As far as I know, in the six years I’ve worked at my school of 700-ish students, there has never been a student who used a wheelchair or other assistive mobility device. There have been, and are, a few students with hearing impairments, vision impairments, and chronic illness (asthma, juvenile arthritis, seizure disorders) but not at all in proportion with the census data. That, too, is a blog post for another day: Is this disparity particular to our school or is it perhaps seen in other independent schools? And why?
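For a rough sense of the disparity, the census rates quoted above can be applied to a school of this size (a back-of-the-envelope sketch of my own; it assumes, for simplicity, that most of the 700-ish students fall in the census’s 15-to-24 bracket, which the post doesn’t actually break down):

```python
# Rough expected count of students with disabilities at a ~700-student
# school, using the census rate quoted above (Brault, 2012).
# Assumption (mine): treat all students as falling in the 15-24 age bracket.
enrollment = 700
rate_15_to_24 = 0.102   # 10.2% of Americans aged 15 to 24

expected = enrollment * rate_15_to_24
print(round(expected))  # about 71 students, versus the "handful" named
```

Even allowing for younger students at the lower 8.4% rate, the census would predict dozens of such students rather than a handful.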

What we do have is a big – and, I would say, growing – population of students with attention deficit disorder, attention deficit hyperactivity disorder, autism spectrum disorders, dyslexia, learning disabilities, or other disabilities associated with executive functioning. I’m going on observation, not school data, here, but I would guess that, in this regard, our school is at least representative of the census data, which, as of 2010, reported that “2.3 million children had difficulty doing regular schoolwork (6.2 percent)” and that “about 692,000 had a learning disability, 1.9 million had Attention Deficit Hyperactivity Disorder (ADHD), and 1.7 million had an intellectual or developmental disability or condition” (Brault, 2012, p. 13). These students struggle in school. I might even argue that their struggle is made all the more difficult by the competitive, high-pressure, college-focused, achievement-driven culture of a school like ours.

There’s Dr. Bertrand again: “The students who are most marginalized have the most wisdom to share.”

Our school is in the midst of a great deal of change. Three years ago, we moved to a block schedule. This year, we’re rolling out the use of Canvas, a learning management system that will digitize teacher-student communication, assignment submission, collaboration, and assessment and grading. Next year, we will shift all instructional materials and texts to electronic tablets. We are in the midst of a comprehensive curriculum review and redesign, including exploring a new capstone project/experience for our graduating seniors. I have served or currently serve on the committees at the center of these activities, and I can say without reservation that we have been thorough, sincere, and utterly student-focused in these efforts. My colleagues have asked, at every turn, “How will this benefit the students? What do they need?” even when the proposed changes threatened discomfiting change for the teachers themselves. I am proud of my school, my colleagues and myself. And yet: are we perhaps missing an opportunity to involve the students themselves? Sure, student representatives have visited those committee meetings; the administration is dutiful and sincere in connecting with students and soliciting their input. A student served alongside me, the headmaster, parent representatives, and members of the board of trustees on the recently completed strategic plan committee. We ask the students questions, and we listen to their responses. But we haven’t (yet) created a space where the students themselves design the questions, protocols, and experiences of research for themselves, targeting what’s most meaningful to them. And we certainly haven’t specifically sought the students with intellectual and learning disabilities – arguably among the most marginalized students in our school – to share their wisdom. Maybe we should.

So that’s my 1,300-word preamble to the article I cited at the top of this entry. I am so electrified by PAR that I decided to explore the ways that PAR projects have been used with students with disabilities. This article reflects only one such project. In this study, the authors used PAR to examine the degree to which an elementary school was succeeding in imparting social skills to its students with high-functioning autism spectrum disorder (HFASD). The study is predicated on the idea that “one of the central roles of schools is to prepare students for life in the work force or postsecondary education and help produce competent adults” but that “many skills beyond academics are needed to succeed in college and/or the work force, including adaptive social skills” (Ostmeyer and Scarpa, 2012, p. 933). These social skills include “listening to others, following steps, following rules, ignoring distractions, taking turns, asking for help, getting along with others, staying calm, taking responsibility for one’s own behavior, and doing nice things for others” (Ostmeyer and Scarpa, 2012, p. 932). These are crucial skills for making one’s way in the world, and they are central to my own pedagogy. However, children with high-functioning autism spectrum disorder struggle with these very skills, and with disastrous results:  “Although children with HFASD score in the average or above-average range on intelligence measures, 70% to 90% of these children underperform academically in at least one domain, including math, reading, and spelling” (Ostmeyer and Scarpa, 2012, p. 933), suggesting that “social skills play an important role in … academic performance and that social enhancement may positively impact academic skills” (Ostmeyer and Scarpa, 2012, p. 933). The stakes are higher than that for these students: deficits in social skills can lead to low self-esteem, depression, and anxiety (Ostmeyer and Scarpa, 2012, p. 933). 
Children who haven’t developed these skills “are also more likely to be rejected, teased, and bullied by peers” (Ostmeyer and Scarpa, 2012, p. 933). Which, as anyone can tell you, leads to anxiety and depression, which don’t incline a kid to get out there and get cracking on buffing up those social skills. It’s a particularly vicious cycle.

The authors engaged in a process of PAR to “gather information on the need for social skills interventions in schools, potential benefits, and barriers to school-based implementation.” True to the values of PAR, the researchers involved stakeholders – in this case, not students themselves but 14 school staff members (“the school principal, a school psychologist, general and special education teachers, special education aides, and teachers of ‘specials’ (i.e., art, library)”) (Ostmeyer and Scarpa, 2012, p. 935) and two mothers of children with HFASD at an elementary school. I’ll be brief in summarizing the research design, results, and discussion here, because I want to get to the real takeaway, which is the potential of this study as a template for a PAR study at my school.

Research Design:

  • Participants attended either a focus group or an individual meeting, each lasting 60-90 minutes.
  • At the meetings, researchers defined social skills, emphasized their importance, and shared current research about social skills.
  • Participants completed a questionnaire and then participated in guided discussion about how social skills programming could be implemented in their community.
  • Classroom observations were conducted of two male students.
  • Qualitative and quantitative results were compiled and presented to the school stakeholders.
  • A tentative plan for the implementation of a social-skills program was designed.

Results from the Interviews:

  • Participants agreed that social skills were important.
  • School staff participants were wary of programs or interventions that removed students from the classroom.
  • Staff participants were worried about taking time away from core academic subjects.
  • Staff participants were worried about the time needed to train staff.
  • Staff participants urged a model that was inclusive; that is, it didn’t target the students with HFASD but included the whole class.
  • Parent participants indicated that their students might need individualized social-skills instruction.
  • Parent participants worried that teachers would be uncooperative with a new program because of the time crunch for training or other classroom responsibilities.

Results from the Observations:

  • The observed students demonstrated deficiencies in most of the skills listed as important social skills in an earlier part of this discussion (following directions, etc.).
  • Peers of students with HFASD were observed to be patient, understanding, friendly and inclusive but may have inadvertently reinforced some of the disruptive behaviors of the students being observed.

Discussion/Findings:

  • Stakeholders agreed that social-skills training was both wanted and needed in the community.
  • Stakeholders agreed that lack of social skills negatively affected academic performance.
  • Stakeholders believed that educating peers about how to treat their peers with HFASD and educating them about the HFASD characteristics would help the social interaction.
  • Stakeholders worried about time away from core academic instruction.

Although this article gave me some very practical insight about how to design a mixed-methods PAR study (How many participants, how many meetings, of what kind? What methods, what instruments?), I’m left with so many questions at the end: the article does not discuss the specifics of the program tentatively designed by the research participants and presented to school decision-makers, nor does it discuss to what extent the program was implemented. As for the observational element of the study, the article uses a mysterious passive voice (“observations of two male students with HFASD were conducted” [emphasis mine]) (Ostmeyer and Scarpa, 2012, p. 936), suggesting that that component of the research wasn’t so participatory after all, but rather conducted by the “official” researchers. Finally, and most importantly, I’m wondering if a study like this, which includes as participants not the marginalized people themselves but people one step removed from the marginalized people, is really true to the purpose of PAR as I understand it.

To me (a newbie to the subject, I grant), this seems like PAR-lite. This study doesn’t locate the power of research with the people on the low(est) end of the power differential; it didn’t yield any actionable, concrete findings that could be or were implemented to immediately and in real ways benefit the children with HFASD; and it doesn’t fulfill the goal/promise of PAR, which is to empower the very people who have been disempowered. Students with HFASD, like students in urban schools, can be “dehumanized, denied agency, and not allowed to speak on schooling conditions from their perspective.” To address this marginalization with underserved urban youth, the Council of Youth Research “works to empower students to become agents of change” (Bautista, Bertrand, Morrell, Scorza, & Matthews, 2013, p. 4). The people who stand most to benefit from the research into social-skills programming for children with HFASD are children with HFASD, and here they are being spoken for by the adults in their lives. This, of course, brings me right back to Aurora Levins-Morales’s Medicine Stories (1998), in which she argues that “the disempowerment we all experienced as children has little outlet. We are taught to obey until our own turn comes, with few opportunities to politicize the experience and critique it” (p. 51). Of course, the parents and staff who participated in this research love the children and want to support and serve them, but they are not, ultimately, the true stakeholders at the center of this line of inquiry. Aurora Levins-Morales (1998) again: “The fact that many parents are deeply loving, fair and committed to their children’s well-being does not change the fact that this is largely a matter of luck for the child, that she or he has almost no control over the conditions of daily life” (p. 52). I would imagine that control is further diminished when you’re a kid with HFASD who struggles with social skills.

But PAR is–or has the potential to be–precisely that opportunity for children to politicize and critique their experiences as students! To me, this study is PAR by proxy. I think true PAR depends on that essential tension between the power a person or group has historically held and the power presented by the very act of PAR. It has to challenge, if not invert, the power differential. If you’re doing PAR with people close to the people who stand to benefit most, people related to the people who have been the most silenced, I’m not sure it works. Yes, teachers and parents are to some extent powerless; they can’t make sweeping curricular change on their own without OK from the top–and there are many layers of top on top: the school administration, the school district, the board of education, the state government, the federal government. But I assume that teachers and parents have some voice and some kind of venue–PTAs or staff/faculty interactions–that the children at the heart of this issue do not. I don’t know how old children have to be before they can be involved in meaningful PAR (in fact, a classmate of mine asked this very question last night! Thanks, Jeff!), but I suspect the answer has to do with what you want out of the PAR, as well as who has to “buy in” to the findings to effect change and how those people feel about children. I also suspect that the answer might be “not that old.” I suspect that elementary-aged children can be meaningful participants in PAR.

So back to my school and the idea that’s hatching in my head: What if the students who have those intellectual challenges I listed were recruited to perform PAR at PCDS? What if they worked to design research protocols and to collect data, and then they synthesized and presented their findings to the headmaster, the board of trustees, the faculty body, the parent body, the student senate, the student body? What if their findings had real and immediate impact on the questions we’re asking ourselves now, which include pressing questions about learning management systems, class size and teacher load, technology implementation in the classroom, elective offerings and student choice in curriculum, graduation requirements, capstone projects, college counseling, and community-building, to name only a few? I imagine these students might have some real, heretofore underappreciated wisdom to share that could have immediate impact on the decisions we as a school are making. And I think the very process could serve to empower these students, who I think may be among the most marginalized students in our generally-not-so-marginalized school.

There are some steep-seeming logistical concerns. Here’s a non-comprehensive list:

  • What would be the criteria for inclusion as one of the participants? Would students have to have an official diagnosis, or would self-identification suffice?
  • How could I recruit students in an ethical, sensitive, appropriate way?
  • Would students be reluctant to participate, for fear of “outing” themselves? Would their parents worry about stigma?
  • Would students be inclined to take on the additional work of PAR, on top of schoolwork that might already be challenging because of their intellectual or cognitive challenges?
  • Would students be incentivized to participate without a grade? Could they get class credit for participating? Is that ethical?
  • Would my school administration support or embrace such a project?

I don’t know the answers to those questions, but I think they’re worth exploring. After all, the timing seems perfect (as a school, we are in a period of self-reflection, reevaluation, and change) and the student-centered goals of PAR are absolutely consistent with the values and mission of PCDS. But if I did this, it would have to be the students’ voices front and center–not their parents’, not their teachers’, not their doctors’/therapists’/tutors’/coaches’.

No PAR by proxy at PCDS.

References:

Bautista, M. A., Bertrand, M., Morrell, E., Scorza, D., & Matthews, C. (2013). Participatory action research and city youth: Methodological insights from the Council of Youth Research. UCLA, 115(100303), 1–23.

Brault, M. W. (2012). Americans with disabilities: 2010. Household economic studies, Current Population Reports. Washington, D.C.: U.S. Department of Commerce, Economics and Statistics Administration, U.S. Census Bureau, 1–23.

Denzin, N., & Lincoln, Y. (2008). Introduction: Critical methods and indigenous inquiry. In Denzin, N., Lincoln, Y., & Smith, L.T. (Eds.) Handbook of critical and indigenous methodologies. 1-20.

Dunbar, C. (2008). Critical race theory and indigenous methodologies. In Denzin, N., Lincoln, Y., & Smith, L.T. (Eds.) Handbook of critical and indigenous methodologies. 85-99.

Levins-Morales, A. (1998). Medicine stories: History, culture and the politics of integrity. Cambridge, MA: South End Press.

Ostmeyer, K., & Scarpa, A. (2012). Examining school-based social skills program needs and barriers for students with high-functioning autism spectrum disorders using participatory action research. Psychology in the Schools, 49(10), 932–941. doi:10.1002/pits.21646

Stobbe, M. (2011, February 27). AP impact: past medical testing on humans revealed. The Washington Post. Retrieved from http://www.washingtonpost.com/wp-dyn/content/article/2011/02/27/AR2011022700988.html

 

The Dynamic Process of Kindergarten Transition

Rimm-Kaufman, S. & Pianta, R. (2000). An ecological perspective on the transition to kindergarten: A theoretical framework to guide empirical research. Journal of Applied Developmental Psychology, 21(5), 491-511.

The transition into kindergarten signifies a very important step in the lives of young children and their families.  Although many children in the United States attend various types of preschool programs, the transition into formal schooling is a big step both for children who have never had preschool experience and for children who have had the opportunity to engage in a preschool program.  Rimm-Kaufman and Pianta (2000) conceptualize the importance of transition programs and activities in the year prior to kindergarten, and offer an approach to these activities grounded in an ecological perspective.  This approach includes three main areas of focus.  First, a focus on relationships between children and their environments, such as schools, peers, families, and neighborhoods (Rimm-Kaufman & Pianta, 2000).  Second, measures of school readiness need to take into consideration the effects that these relationships have on the child.  Third, Rimm-Kaufman and Pianta (2000) discuss the importance of examining how these relationships change over time and affect the child and the success of the transition.

There has always been a research interest in the process of children transitioning from home or preschool into formal schooling, but the popularity of this topic has increased even more in the educational research field in the last 10 years due to the dynamic nature of our current educational system as well as the changing landscape of our family structures.

The expectations for early learners are continuously changing, increasing, and developing as mandates from federal and state policy makers are implemented to try to raise the bar for educators and their students.  Along with demands for a higher level of academic performance, kindergarten students also have many social-emotional adjustments to make during this transition year.  Independence from their parents, being alert and attentive for five hours a day in school, and moving from mostly parent-child relationships to forming and maintaining relationships with their peers are all significant social-emotional adjustments (Rimm-Kaufman & Pianta, 2000).

Other factors that promote the popularity of research in this area of education include the increased number of children between the ages of 4 and 7 in our country. The United States showed a two-fold increase in the population of preschool-age children from 1973 to 1993.  Changes in family dynamics are also factors that warrant research in this area.  There are many more families now than a decade ago with single-parent households or with both parents working while they have small children.  Also, there is a growing population of children who are subject to the consequences of welfare reform and are experiencing more stressful home lives (Rimm-Kaufman & Pianta, 2000).

With all of these factors taken into consideration, it is clear that educational systems need to create educational reform that includes a comprehensive program accounting for all of these risk factors as preschoolers transition into formal schooling.  The goal of new research would be to help students begin their kindergarten year with as much support as possible, given their family dynamics and experiences prior to kindergarten, to set them up for success.

The authors noted that with all of these changes, the way this transition process is studied is evolving.  This evolution has everything to do with the increasingly complex family dynamic and other societal factors.

When researchers first began to look at the transition period into kindergarten, they often focused on child characteristics.  In other words, they focused on gender, behavior, ethnicity, and so on.  More popular now is the idea that there are far more impactful elements in a child’s life that can affect the success of the transition into kindergarten.  Researchers now focus on societal influences, such as programs to help the child transition (meet-the-teacher events or hello parties), the quality of preschool experiences, and interactions between the parents and the child as well as between the parents and the teacher (Rimm-Kaufman & Pianta, 2000).  The authors argued that determining what makes for a successful transition into kindergarten is complex and should take on a more ecological approach.  An ecological approach can be best understood as looking at persons, families, cultures, communities, and policies and identifying their effects on the child.

All of these factors can help researchers conduct research to better inform policy makers and school districts not only on the importance of preschool to kindergarten transition programs, but also help develop them so that they are created for the specific needs of the community they service.


How do they do it?

I’ve always been intrigued when hearing about Carl Hayden High School’s robotics team. I know nothing definitive about them, and I feel like I haven’t heard about them for a few years, but if I’m remembering correctly, this high school from the west side of Phoenix competed in and won national robotics competitions against colleges like MIT.


At the time, I could not understand how this small high school could compete with some of our nation’s most prestigious universities. Learning about communities of practice has made it clear how this happened. My assumption now is that a culture was fostered, started by strong leadership and kept going by team members who cared about the individual members of the team but cared also for the community as a whole and most likely fought hard to keep that culture going after they graduated.

I saw that too in my high school, Brophy, with one of our recent grads. A few years ago, a member of our robotics team set out to bring robotics down to the grammar school level. Our wonderful student Gabe started a team with our adjacent middle school Loyola Academy. Loyola Academy is a grades 6-8 school that only students who qualify for free tuition can attend. Brophy is an expensive school that many wealthy parents send their sons to, but we also have ample scholarship students who enrich our campus, attending Brophy for free because of scholarship donations. Loyola Academy, though very rigorous in curriculum, has none of the affluent-type students you’d find at Brophy – Loyola students predominantly come from south Phoenix and many live in Boys Hope, an organization that helps children without parents find a family structure.


Well, our student Gabe started this robotics team at Loyola. Many of the members of the Brophy robotics team were perturbed because he left our team to do so. They were also disappointed because the middle schoolers began to beat the high schoolers in competitions. And, by the way, Gabe has since gone on to the aforementioned MIT to study both medicine and robotics.

I will add that the Loyola team hasn’t been the same without Gabe, and this gets me back to our readings on communities of practice, especially the Jordan and McDaniel piece on robotics specifically: teams or communities need a strong group of peers to fall back on and to strengthen the community when needed. Jordan and McDaniel (2014) wrote, “Learning to participate in engineering practices is one context in which uncertainty is particularly relevant. Engineering is an enterprise in which dealing with uncertainty is a central figure” (p. 4). This is why I see the role of a strong peer group as so important to groups like these. I also see the importance of one transcendent peer who is able, by personality or sheer force of will, to keep the group together. Jordan and McDaniel set out to study how peers deal with uncertainty in engineering and robotics, and I see my former student Gabe in this study. The authors detailed the disparate backgrounds of the students in this robotics group and went on to discuss the group’s leader. Jordan and McDaniel (2014) wrote, “The teacher of this class, Ms. Billings, had more than 20 years of classroom experience and was recognized on her campus and across the district for her expertise in science and technology instruction” (p. 8). My assumption is that if you look behind the veil of the Carl Hayden High School robotics program, you’ll find a “Ms. Billings.” Certainly, in my story, Gabe served that role as well, and I believe that successful groups like these, whether robotics teams, band groups, or sports teams, will have someone to guide the community of practice through times and situations of uncertainty.

References

Jordan, M. E., & McDaniel, R. R., Jr. (2014). Managing uncertainty during collaborative problem solving in elementary school teams: The role of peer influence in robotics engineering activity. The Journal of the Learning Sciences, 00, 1–47.


Developing the Developmental Instructor

Kozeracki, C. (2005). Preparing faculty to meet the needs of developmental students. New Directions for Community Colleges, 2005(129), 39–49.

My exploration of developmental education continues as I shift my attention toward instructor preparation and development. Higher education faculty, unlike K-12 instructors, are not required to have any specific education certification in order to teach.  All instructors will have strong content knowledge, as that is the emphasis for being hired (whether in a full-time or part-time role) to teach at a college or university.  But, instructors who teach developmental courses recognize that there is a unique skill set to meeting the needs of this diverse student population.  Unfortunately, many developmental educators do not have adequate preparation to meet student needs.  Carol Kozeracki’s “Preparing Faculty to Meet the Needs of Developmental Students” explores strategies to prepare faculty to better serve developmental students.

Article Summary

This article explores English developmental education faculty members’ attitudes toward professional development.  The study includes interviews with 36 English faculty members who teach developmental courses, representing seven community colleges in two states (one on the East Coast and one on the West Coast).  Each community college has a large enrollment, exceeding 15,000 students.  The structures of the developmental programs are varied as well: centralized, decentralized, and mixed models (Kozeracki, p. 39).

The study explored three areas or strategies for faculty preparation: graduate programs, internal professional development opportunities, and professional associations.  For each of these areas, the author interviewed developmental English instructors to gain feedback on their attitudes about each area and solicited recommendations on how to strengthen each.

For graduate programs, this study concluded that “there is a significant gap between what is learned in graduate school and what they need to know to facilitate student learning” (p. 48).  Furthermore, graduate programs for English instructors should include additional training on how to teach grammar, how to properly design a lesson, and strategies for both recognizing and working with students with disabilities (p. 48).

Regarding professional development programs, Kozeracki concluded that developmental English faculty are most responsive to departmental-level programs that provide strategies for meeting immediate classroom needs and to informal dialogue about teaching and learning practices (p. 48).

Finally, developmental English faculty are least interested in professional associations that provide information and resources that are solely theoretically based; they desire to have practical applications to the developmental classroom (p. 49).

Strengths and Critiques

The audience for this study is most likely faculty development professionals, leaders of centers for teaching excellence, department chairs, and administrators responsible for hiring and developing faculty.  One strength of the article is that it provides tangible suggestions to improve professional development programs for developmental English faculty at a community college.  Specifically, it recommends that “more time should be made at departmental meetings in which faculty discuss pedagogical issues” (p. 46).  The author recommends only using outside speakers who focus on issues of “genuine concern” to the whole faculty (p. 46).  The faculty respondents also indicated that more opportunities to engage with faculty outside their respective disciplines would be beneficial (p. 46).  The most radical suggestion is for colleges to set aside one to two hours per week when classes are not offered so that faculty can have more focused departmental meetings (p. 46).  This suggestion aligns with the K-12 model of common planning periods.  I appreciated this section of the study, as it provided very concrete recommendations based on the responses of the 36 interviewed English faculty members.  The recommendations are relatively low-cost and simple to implement and may have a positive impact.

Despite the recommendations I found useful, I have many critiques of the article which limit its effectiveness.  First, little is shared regarding the demographics of the interviewed faculty members.  I think their length of service to the institution and their own level of preparation and training to teach developmental education are significant pieces of information that would affect responses as to what professional development and training options are beneficial.  Furthermore, faculty attitudes toward graduate school preparation are completely dependent upon each person’s own experiences.  Relating this study to myself, my undergraduate degree is in English and my master’s preparation is in Education.  Consequently, the recommendations offered in this article regarding graduate programs would not apply in my case, as my graduate experience provided me with the content to be a successful instructor.  Second, I thought the study was very weak in identifying which of the instructor training programs may have an impact on student learning.  I do not think simply stating that developmental English instructors believe workshops taught by peers are more beneficial is enough evidence to justify replicating that type of professional development opportunity.  Yes, the instructors like it.  But, and most significantly, is there evidence to show that participation in the specific professional development opportunity affected the classroom and student learning in any manner?  The study was very weak in linking the training to classroom modifications and student success.  Finally, I was very disappointed in the author’s initial description of the current state of developmental education. I recognize the article was penned in 2005; however, I was still surprised that the author included a reference to C.J. Hardin’s work “Access to Higher Education: Who Belongs?”  Kozeracki quotes Hardin’s work detailing six categories of students who require developmental coursework: “poor choosers, adult learners, foreign students, handicapped students, and ‘users’ who lack clear-cut goals and are attending college more for purposes of avoidance than achievement” (Kozeracki, p. 40).  This categorization is appalling, as it implies that having a disability, being foreign, being an adult learner, or not having a defined goal are indicators for needing developmental coursework.  Hardin, whose work I am not otherwise familiar with, is obviously not familiar with the community college mission, as those categories describe any number of high-achieving and excellent students within our system.  Inclusion of this categorization made me question Kozeracki’s understanding of the developmental education mission and core principles.

My Take

As an administrator who oversees a center for teaching and learning at a community college, I found this research helpful, as it provided me with tangible, concrete strategies to enhance our professional development programs within GCC and the Maricopa system.  But further research is needed to identify which of these strategies may best impact instructional practice and student learning outcomes.  With limited resources and faculty members’ limited time, those activities with high impact and low cost and effort may be the ones to implement first.  So, additional study is needed to determine which of the strategies and recommendations would have the greatest impact.

I also think the study should be expanded beyond English developmental instructors only.  One could explore the attitudes and perceptions of math faculty who teach developmental courses. Their preparation and needs may be different, requiring unique strategies to meet their discipline and classroom needs.

Finally, this article made me question our faculty hiring practices in higher education.  For the most part, faculty members are hired based on discipline expertise; teaching expertise is desired, but not a must.  This practice has to change for our developmental student population.  Collectively, as a college community, we need to be more intentional about whom we hire and why we are hiring them.  Potential faculty members with degrees in education or with significant teaching preparation should be moved to the front of the pool for consideration.  Faculty positions should be posted with requirements beyond content knowledge alone: best teaching practices should be required and then demonstrated through the hiring process.  We owe it to our students to have the best and brightest instructors, with both the content and the teaching expertise to help our students achieve.

Education, Equity, Excellence; Research Blog

Research Publication Blog Post Two

Reference:

Arauz, J. C. (2012, November). E3 presents: Education, equity, excellence. Three-part video series, Part 1: What is educational excellence? [Video]. YouTube. Retrieved from http://www.youtube.com/watch?v=ZBEI6ilDv-0

 

Strengths, Contributions and Ways to Improve; Graphic Organizer

Organization: The video was well organized and directed. The narrator developed the argument clearly, and the animation was creative.

Contribution to Field: The video’s contribution to the field was worthwhile and significant.

Literature Review: The video did not provide a literature review.

Theoretical Framework/Lens: The video clearly demonstrated coherence. The research focused on issues our nation faces with its current education system.

Data Collection: Data were collected from inner-city schools with predominantly African American and Latino populations.

Analysis: The video had a profound impact on current education action research.

Findings: The findings of the video were inconclusive; however, the research does outline some assumptions about culturally relevant pedagogy and its meaning for intercultural learning.

Discussion/Conclusions: The video provides a formula for creating successful plans aimed at intercultural learning.

Minor Editorial Comments: No editorial comments for the video.

Miscellaneous: No miscellaneous comments for the video at this time.


Culturally Relevant Pedagogy: Ingredients for critical teacher reflection

In the video Education, Equity, Excellence (Part One, 2012), founder Dr. Juan Carlos Arauz discusses the problems and issues our nation faces with its current education system. This video is aimed at responding to the need for culturally relevant pedagogy. In this first video, Dr. Arauz proposes a solution to the educational crisis in this country through a three-part YouTube video series. This blog post will analyze the first video in the series. According to the foundation’s website, Dr. Juan Carlos Arauz is the founding executive director of E3: Education, Excellence & Equity. E3’s mission is to redefine educational expectations so that every student, regardless of starting point, is engaged and thriving in a school that practices a culture of academic success for all (http://www.e3ed.org/about-e3).

This video was significant to me because it caused me to reflect on ways to create culturally relevant pedagogy. Furthermore, the video was especially significant to my area of study, as it examines how critical teacher involvement is in connecting culture and the classroom. The video implies that teacher education and the creation of culturally inclusive schools and classroom environments are highly relevant, and it challenges educators to examine the impact of cultural resilience. The video offers a very unique look at how a student’s journey through school involves many life challenges, and making their way through those challenges builds exactly the skills employers are looking for. I personally found it very insightful to examine how students in bad situations show resiliency in their everyday struggle. They show this resiliency through:

  • Critical analysis
  • Adaptability
  • Cross-cultural and intercultural communication
  • Collaboration and innovation

This study has caused me to critically examine the relationships between student involvement and the educational achievement gap.

Another dimension of the video was low-income and immigrant students’ need for culturally relevant pedagogy. Specifically, recommendations are offered for teacher preparation and in-service teacher professional development. I learned that educators must reconceptualize the way they teach in order to serve a more diverse student population. The video also gave some very interesting statistics I had not seen before. These statistics serve as a wake-up call for educators and administrators, suggesting that cultural sensitivity and diversity training programs should be a part of every educational program.

This video has a direct correlation to my own experiences. I began my teaching career in a predominantly minority school. As a new teacher, it was very important for me to understand the culture, teach the core curriculum, and reconceptualize my teaching strategy. I did a lot of self-reflection as some things worked and some things did not.

In conclusion, I firmly believe the impact of the three-part video series on education research is profound. It opened my eyes to the need for culturally relevant pedagogy. As stated in the conclusion of the video, “In order for students to be prepared for 21st-century needs, educators must show students how to use their everyday skills so they can proudly stand up and say, I am innovative, culturally resilient, adaptive, collaborative, and cross-culturally aware.” I believe this statement speaks to how this knowledge can impact not just the teacher but also the student and the learning community.

Active Learning in Health Professions

McLaughlin, J. E., Roth, M. T., Glatt, D. M., Gharkholonarehe, N., Davidson, C. A., Griffin, L. M., Esserman, D. A., & Mumper, R. J. (2014). The flipped classroom: A course redesign to foster learning and engagement in a health professions school. Academic Medicine: Journal of the Association of American Medical Colleges, 89(2), 236–243.

The article, The Flipped Classroom: A Course Redesign to Foster Learning and Engagement in a Health Professions School (McLaughlin et al., 2014), is about how the University of North Carolina (UNC) Eshelman School of Pharmacy redesigned the course Basic Pharmaceuticals II using a flipped classroom model. The course redesign was "inspired by a desire to transform the educational experiences of our students and to meet students' requests for enhanced in-class active learning experiences" (p. 237). The article discussed the changes implemented in the course redesign, which included replacing in-class lectures with online videos watched outside of class and then spending valuable class time on active learning exercises. The three main focal points were offloaded content (recorded videos, etc.), student-centered learning, and appropriate assessments.

In trying to determine whether implementing a flipped-classroom model would be effective, the researchers obtained approval from the UNC institutional review board to administer pre- and post-surveys regarding demographic information, students' perceptions of active learning activities, preferred curriculum delivery format, and engagement. In addition, they collected data on exam scores and additional assessment tools and compared the outcomes of the traditional classroom format (class of 2011) to those of the flipped classroom format (class of 2012). The findings validated that overall student learning increased after participating in the flipped-classroom format.

Review of Strengths and Contributions

Organization – The article was well organized. I particularly valued that the authors compared the traditional lecture and course design to the newly implemented student-centered pedagogy.

Contribution to Field – The authors acknowledge that there have been many significant changes to how healthcare is delivered and discussed the increasingly complex healthcare system, yet state, "little has changed in the way that education is structured and delivered to aspiring health professionals" (p. 236). This article contributes to the field by showing that incorporating active learning into a classroom can enhance learning, improve outcomes, and more fully equip students to address 21st-century health care needs.

Literature Review – In my review of the article, what I found most interesting is what is happening in traditional classrooms: "students' attention declines substantially and steadily after the first 10 minutes of class and that the average attention span of a medical student is 15 to 20 minutes at the beginning of class. Although students' attention returns in the last few minutes of class, they remember only 20% of the material presented during that time. Furthermore, passive learning in hour-long lectures often bores students and can deprive them of rich educational experiences" (McLaughlin et al., 2014, p. 236).

Analysis/Findings – The authors compared pre- and post-course survey responses, course evaluation responses, and final exam scores between the traditional and flipped-classroom cohorts. The findings conclude that students in the flipped classroom rated the class higher in areas such as comprehension of material, engagement during class, and preparedness.

Discussion/Conclusions – The authors discussed in detail how the course was redesigned but, more importantly in my opinion, honestly discussed the time commitment required of both the instructor and the teaching assistants (TAs). While the initial time commitment by the faculty is significant, it will decrease in subsequent years; for the TAs, however, the time commitment will remain static. By showing the time commitment implications, I feel that future teachers will feel motivated to incorporate active learning techniques and confident that in subsequent classes or years they will not need to devote as much time to planning for the same class material.

Miscellaneous – What I found particularly valuable were some of the next steps and changes to be implemented for the spring 2013 class. Some of those changes include: no longer considering the textbook to be required reading, replacing the student presentations and discussion with a new 30-minute active learning exercise, and creating "an online 411 Pharmacopedia to be used as an information portal for expanding concepts, new technologies, breaking news, current clinical trials, new drug products, and Web links" (McLaughlin et al., 2014, p. 242). This showed that the authors were incorporating ways to improve the course.

Response

In my blog post from last week, I reviewed the article Does Active Learning Work? A Review of the Research (Prince, 2004). While I am in the infant stages of researching IF and HOW active learning works, I happily find myself being drawn into wanting more information. Some of my curiosity revolves around how students balance their in-class time with their out-of-class responsibilities, and what the long-term material retention statistics are for those who participate in an active learning setting versus a traditional lecture classroom setting.

I am interested in implementing more active learning sessions for a course that I co-direct for fourth year medical students. During their final year of medical school, the students are in their elective rotations locally and across the country. In the spring, prior to graduation, we bring them back for a two-week course that is designed to help prepare their transition into residency. There are some active learning sessions during these two weeks, but approximately 80% of the course sessions are lecture based. In working with the director of the course, we are trying to develop sessions that involve more student involvement and particularly enhance ways to assess their clinical skills. I feel this article (McLaughlin et al., 2014) and the study described within can help persuade administration to allow us to achieve our goals of designing more active learning sessions and move away from the traditional lecture-based sessions.

References

McLaughlin, J. E., Roth, M. T., Glatt, D. M., Gharkholonarehe, N., Davidson, C. A., Griffin, L. M., Esserman, D. A., & Mumper, R. J. (2014). The flipped classroom: A course redesign to foster learning and engagement in a health professions school. Academic Medicine, 89(2), 236–243.

Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223–231.

Making the Transition: High School to College

Reference

Venezia, A., & Jaeger, L. (2013). Transitions from high school to college. The Future of Children, 23(1), 117–136.

In continuing my review of scholarly writings this week, I found the article Transitions from High School to College, which provided information that directly relates to my line of inquiry. The title served as a direct introduction to the subject matter presented by the authors. The focus of this piece was to provide research and insight on current trends in providing interventions to improve access to higher education for high school students in the United States. Venezia and Jaeger (2013) presented their research by looking at the state of college readiness among high school students, the effectiveness of programs in place to help them transition to college, and efforts to improve those transitions.

In looking at the formulation of the research presented in this reading, it was easy to see from the outset that the authors were focused primarily on using quantitative data to support their findings. The researchers looked at statistical data from various sources, including the National Center for Education Statistics, the U.S. Department of Education, the College Board "SAT Report," and several others. I felt the research was conducted from a largely analytical approach. Although the text provides some excellent support for the authors' positions, it seemed to be a report motivated more to impact policy makers. The study showed very few humanizing elements, telling me a more data-driven approach shaped the research methodologies used for this report.

The information presented was framed to help one understand, first and foremost, that there is a problem in the U.S. with students being ill-prepared for college entrance. The report also lightly approached the idea that social inequity continues to hamper access to college in underserved communities in the U.S. The paper leaves readers contemplating the effectiveness of current measurement tools for college readiness, because it is something that history shows is challenging to track. The report also helps readers understand that the college transition challenges facing high school students are recognized at the national level; hence, state and federal funding is currently being used for success programs like TRIO, Early College, GEAR UP, and Upward Bound. The article also reflects on Common Core standards and the push at the national level for college preparedness, and presents readers with the idea that there is not one particular fix for guiding students toward college and career readiness. The end findings of this article can be summed up in one of the authors' final statements. According to Venezia and Jaeger (2013), "While great variation in approaches and implementation strategies will no doubt continue, the field would benefit from a more comprehensive and consistent method for learning what works across different types of reforms—for example, using similar definitions and metrics—to help clarify what is transportable, effectively, across different contexts and scaling needs" (p. 132).

As a reader, I felt the authors of Transitions from High School to College did an excellent job of initially capturing my attention by presenting their stance and position on what the reading was going to cover. The piece itself was very coherent from start to finish, and all topics were presented in a measured fashion to help the reader understand the problem, potential solutions, and end findings. The data was structured into this writing nicely to support the authors' points and to help readers continue to build an understanding of the issues of high-school-to-college transitions in our country. One of the strongest points made in this article came as the argument was laid out: as a reader, I could sense that there were not going to be any conclusive final recommendations to solve the issues being presented.

In reflecting on this reading, I think it may contribute to my research and field of inquiry. Was this article worthwhile? I would say yes, to an extent. I will keep it in my archive for the reference points I felt were very sound. I will say the strength of the argument was not supported as well as I thought it could have been, because of the lack of connection to the human side of the challenges being presented. I know not all scholarly writings are intended to appeal to a reader's emotions, but if the authors had provided a more human element, this reading would appeal to me even more.

Some of the key items in this article were highlighted in the way the authors framed their argument and presented their story while supporting it with data. As I read through the material, the authors helped me see some of the challenges faced in measuring the topic at hand. Another part of this writing that impressed me was the authors' final take on the analysis: in addition to directly supporting academic preparation for students, capacity-building efforts need to focus on ensuring that large comprehensive high schools have strong college-going cultures and on providing the necessary professional development for educators to help all students meet college readiness standards (Venezia & Jaeger, 2013). This was well stated. Again, the findings were not robust, but the authors helped me understand that more research is needed in this area.

The authors of this paper made good connections between their analysis and the material being presented. As a doctoral student who is looking at research methods and tools, I definitely found some of the examples and materials used to be potential tools for my future research efforts. I found clarity in the way some of the material was presented, but also questioned some portions of the writing. I learned from this piece that sound data and resources can help the reader better understand the issues you are looking at. But when topics are presented with few findings to support the actions currently being taken, it can create challenges in trying to convey your message to the intended audiences.

I selected this article to review because I found some positives and negatives in how it read. My final thoughts were that the use of statistics and data can do an excellent job of helping support an issue you want people to recognize. The authors of this article did an excellent job of using national statistics and measures to show readers the impact of the issue at hand. In my professional setting working in higher education, I occasionally have to work with top-level leadership to present my views or ideas and to gain support or funding for projects or other needs. The use of data to support my argument always seems to play a vital role in the impact I can have on my audience. Solid supporting quantitative data can always help. On the other hand, this study might be able to build on its argument and position by bringing in more qualitative research. Storytelling focused on the collective impact of the challenges being faced, and the results to be had from some of the programs discussed, might help this research become even sounder in the future. The result of reflecting on this article: I had some positive takeaways to help me in my review of research practices, but still many questions to answer before I can become a successful academic researcher.

The Importance of Student Motivation in Short-Term Study Abroad

Allen, H. W. (2010). What shapes short-term study abroad experiences? A comparative case study of students' motives and goals. Journal of Studies in International Education, 14(5), 452–470.

Summary of Study
Having been a participant on a short-term intensive French language study abroad program twice myself, I found Allen’s study to be very relatable.  Allen chose to examine the goals and motivations that shaped two female students’ short-term study abroad experiences, specifically examining their language learning while participants on a 6-week program in Nantes, France.  If there are naysayers out there about the ability of short-term programs to instill long-term cross-cultural benefits in students, it seems that there are even more academics who contest the true language skills students are able to derive from participating on a short-term study abroad program such as this one and the two I completed.  Consider the literature Allen cites, including Davidson (2007), “[Davidson’s research] claimed that for programs of 6 weeks or less, development of linguistic and cultural proficiency is extremely unlikely to occur” (p. 453).  Certainly I do not think that fluency in a language is something to be expected by participating on a program of such a short length, but reflecting on my personal gains, I cannot help but think there are other gains made, such as the acquisition of more colloquial and current vocabulary, as well as gains in self-confidence leading to continued study of the language.  This latter point was something that Allen actually examined in this study as well.

Allen's findings indicated that each student's motives for pursuing study abroad seemed to have the most impact on determining to what extent her language skills developed during the study abroad program.  Allen's connection to Lompscher's (1999) characterization of the types of learning motives was very powerful at contextualizing this finding, explaining that "Molly's [learning motives] were consistent with social learning (i.e., to communicate or cooperate with others) and higher level cognitive motives (i.e., arising from her intrinsic interest in learning), whereas Rachel's were consistent with lower-level cognitive motives (i.e., learning with the goal of obtaining a result)" (p. 467).  Therefore, it was logical to see that after the program, Molly went on to declare a major in French and was considering moving to France upon graduation, whereas Rachel ended her study of French after she completed her minor in it, something she had cited as a reason for participating in the study abroad program in the first place.

Strengths and Critiques
Though the research questions posed in this study are not terribly unique, as language acquisition during study abroad programs has been studied before, I found Allen's use of an activity theory perspective to be quite revealing in the study's findings, namely attributing importance to students' own goals and motivations for participating in a study abroad program.  Overall, I find the article to be well organized and developed, leading to logical conclusions and important discussion for the field (particularly for advisors and faculty leading short-term programs, who are perhaps best positioned to work with students on identifying their goals and motivations).  However, the fact that Allen only examined two students on a single program makes drawing any large-scale conclusions rather tough.  The methodology included recruiting eight participants and then hand-selecting two female students based on shared characteristics such as GPA, little prior travel experience, and similar levels of French language skill.  I presume this allowed for even ground for comparison; however, I wonder if bias might have been present, given that this does not seem very randomized.  The study's main source of data came from participant blog entries and three interviews.  A pre-trip questionnaire and one administered during the program's final week were also used.  The researcher also explains that she served as the program director for the trip and interacted with participants weekly, "allowing establishment of trust"; however, I think that this, too, might have introduced some level of bias when analyzing data.

Relation to Personal Experiences
As I indicated, since I participated on a short-term French language program, it was helpful for me to reflect on my own motivations for pursuing a study abroad program and to think about how much I feel I learned on my program in terms of language acquisition.  For me, participating on the study abroad program was not an option but something that I felt I had to do in order to come closer to my overall goal of one day being fluent in the language.  This seemed similar to Molly’s motivation as Allen indicates, “What led Molly to participate in study abroad was that full immersion was critical for achieving French fluency” (p. 457).  However, like Molly, I fully acknowledged that participating on a short-term program was not going to make me become the fluent speaker that I desired to be.  Rather, I saw it as an important next step in bringing me closer to that goal.  After having 4 years of French study, it was time to take what I learned in the classroom and put it into practice, which was an important step in building my confidence at speaking French.

At the end of the program I felt pleased with what I managed to accomplish on my 4-week program.  Though I was not fluent, the biggest gains for me were that 1) I had connected with French people my age who taught me how young French people really speak (colloquial and slang variants as opposed to the textbook French I had been learning) and 2) I had succeeded in communicating with a variety of French people in multiple aspects of their society, not the least of which was passing two courses taught completely in French at a French institution.  These successes were critical in building my self-confidence in continuing to speak and learn the language and, I think, helped me to make greater gains in the language at a quicker rate than I was achieving in a traditional classroom setting.  I liken this to the fact that, like Molly in Allen's study, my goals were more in line with wanting to be able to communicate and learn for the sake of learning, rather than merely wanting to complete the program to advance a degree requirement, which is more in line with Rachel's motivations.

Ideas for My Area of Interest
In Allen’s discussion and conclusion, she raises two excellent points for my area of interest.
First, based on her findings, Allen asks, "How can study abroad curricula accommodate students with varied motives and goals who enact agency in different ways?" (p. 469).  Given that students' decisions to pursue study abroad opportunities are so varied, how can international educators and faculty directors work together to develop learning outcomes that will speak to these differences?  As Allen discovered in interviews with Rachel, part of the reason that she did not advance as much as she wanted to with her French language skills stemmed from the fact that she was unable to adapt her learning methods to a style outside of the traditional classroom. International educators need to develop ways to identify students who are not as self-sufficient at adapting to the new learning environments that study abroad incorporates and provide interventions that assist these students accordingly.

Secondly, and this is a topic I have considered for my own research before, Allen discovered that “Blogging can serve as a tool for self-reflection and goal-setting; however, it is evident that blogging without faculty mediation or other intervention is insufficient” (p. 469). Engaging students in meaningful reflective practices before, during, and after their time abroad is something that I believe is necessary to ensure maximum gains in benefits to be derived from participating in study abroad.  However, this reflection must be monitored and leaders of programs must intervene in order to help students make meaning from their experiences and reflections.  One idea I have is the development of an online course module that study abroad participants would engage in during their experience abroad, regardless of what program they were studying on.  This online module would be designed to be a reflection intervention that provides a way for international educators to help students work through the obstacles they might encounter when pursuing their goals and developing their new skills on their programs.

Reference:
Allen, H. W. (2010). What shapes short-term study abroad experiences? A comparative case study of students' motives and goals. Journal of Studies in International Education, 14(5), 452–470.

Teaching Pre-service Teachers Content Area–easy….Technology–notsomuch

Hubbard, J. D., & Price, G. (2013). Cross-culture and technology integration: Examining the impact of a TPACK-focused collaborative project on pre-service teachers and teacher education faculty. Journal of the Research Center for Educational Technology, 9(1).

Cross-Culture and Technology Integration: Examining the Impact of a TPACK-focused Collaborative Project on Pre-Service Teachers and Teacher Education Faculty (Hubbard & Price, 2013) is an article about research done with pre-service teachers.  The intent of the research was two-fold.  First, it was to have pre-service teachers create an instructional lesson using the TPACK model as a basis.  Second, it was to determine how those pre-service teachers might incorporate technology into their lessons when they become in-service teachers in the future.

The study was based on research that supported several sub-components of TPACK (Technology, Pedagogy, And Content Knowledge).  The investigators provided research backing the need for study in five different intersecting categories. After detailing each, the authors chose to focus on just two of those, the last two of the five.  Laying out the literature section in this way proved to be one of the weaknesses of the article: the most important items should have been listed first and given more weight and supporting research.  No justification for why these two of the five were targeted was ever given.

The researchers went on to describe that they wanted the pre-service teachers to create a Learning Activity Type (LAT) project that required the application of inquiry-based social studies skills.   They also mandated the use of Microsoft Photostory 3.0 for the technology component.  The pedagogical basis was the concept of culturally responsive instruction, which corresponded to the social studies content knowledge of multicultural and global perspectives.

The requirement for the pre-service teachers was to interview a foreign-born person and then use internet research skills to gain additional information about the country from which that person came.  They were then tasked with organizing the information and using Microsoft Photostory 3.0 to create a final project telling the story of their interviewee's life, culture, and heritage.

There were 83 students who participated in this project, all of whom were juniors at a university enrolled in a K-6 elementary school program.  These students were assigned to one of four classes of about twenty students each, and each class had one of four instructors.  The students also met, as a class, periodically in the computer lab.  A separate instructor was available there whose sole job was to help with the technology aspect of this assignment.  That person kept a journal of his observations of the classes for the research project but was not considered an instructor for purposes of this study.

The quantitative data came from two sources: surveys given to the students and surveys given to their instructors.  The student survey had nine questions describing the learning experience, with two versions of Likert scales, one set for the first five questions and another for the remaining four.  Of the 83 surveys returned, 82 were usable.  Of the four instructors, one was also one of the researchers, and that person chose not to complete a survey to minimize bias in the results.  This is another weakness of the research: there were only four instructors to begin with, and one rightly withheld the survey.  That reduced the results by 25% of what they could have been, and with such a small sample size to begin with, that may have had a large impact on the results.  A strength, however, is that the other three surveys were sent out to be evaluated by a separate set of technology experts, rather than the researchers working on this project, in order to minimize any conflict.  The instrument had a reliability coefficient, using Cronbach's alpha, of .832, and the standard error of measurement was found to be 2.033 (Hubbard & Price, 2013).
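For readers unfamiliar with these two statistics, they can both be computed directly from a respondents-by-items matrix of survey scores. The sketch below is a minimal illustration, not the authors' actual analysis, and the example data in the usage note is invented:

```python
import numpy as np

def cronbach_alpha(scores) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of Likert scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                              # number of survey items
    item_variances = scores.var(axis=0, ddof=1)      # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def standard_error_of_measurement(scores) -> float:
    """SEM = standard deviation of total scores * sqrt(1 - reliability)."""
    sd_total = np.asarray(scores, dtype=float).sum(axis=1).std(ddof=1)
    return sd_total * np.sqrt(1 - cronbach_alpha(scores))
```

The two reported figures are consistent with each other under this formula: with alpha = .832, a total-score standard deviation near 5 gives 5 × √(1 − .832) ≈ 2.05, close to the reported SEM of 2.033.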

The results from the first five questions on the student survey showed that the pre-service teachers were pleased with the class.  Responses for those questions ranged from 86.6% to 95.1% answering "fairly" or "very" useful on items such as the types of hand-outs used in class, the usefulness of the class, etc.  The remaining four questions garnered a more varied response, with only 37.9% reporting that they were fairly or very likely to use Microsoft Photostory 3.0 as a teaching or learning tool.  The results of the instructor survey showed that although the instructors felt "very satisfied" with the course, they reported an overall lack of comfort with the technology tool being used, which left them unable to help their students to the extent they would have liked.
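Percentages like the 37.9% quoted above are "top-two-box" frequency tabulations over the Likert responses. A minimal sketch, using invented response data rather than the study's raw data, might look like this:

```python
from collections import Counter

def pct_top_categories(responses, top=("fairly", "very much")):
    """Percentage of Likert responses falling in the given favorable categories."""
    counts = Counter(responses)                      # tally each response category
    favorable = sum(counts[category] for category in top)
    return 100.0 * favorable / len(responses)

# Hypothetical responses to the Photostory question (not the study's actual data)
answers = (["very much"] * 9 + ["fairly"] * 5
           + ["a little"] * 12 + ["not at all"] * 14)
share = pct_top_categories(answers)  # share answering "fairly" or "very much"
```

With the invented data above, 14 of 40 responses fall in the top two categories, so `share` is 35.0.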

In addition to the two surveys, artifacts were collected throughout the research project.  The researchers recorded classes on video, held one-on-one meetings, took notes, and interviewed pre-service teachers.  The results indicated that the project did not overwhelmingly help pre-service teachers view technology as a necessary component of teaching, though the survey showed it did help them gain an awareness of technology and content knowledge (i.e., the culturally responsive component).  The sample size is too small for the results to be generalizable.

Although this research was focused on pre-service teachers and I want to create a TPACK action research project for my fifth grade classroom, I still found many ways I can apply some of these concepts to my own project.  One of the thoughts generated from the survey results was that the pre-service teachers did not understand why they were required to use Microsoft Photostory 3.0.  They saw it only as a requirement to contend with rather than a concept to master.  I can definitely apply those results to my research.  My students may respond better to the technology I use in my study if they understand that it is something to learn in and of itself and not just a meaningless requirement.

I also made a connection between this piece and the article Rural Elementary School Teachers' Technology Integration (Howley, Wood, & Hough, 2011).  That article described how vital the attitude of the teacher is to the successful implementation of technology.  Although the situation here differs because the instructors did want this to be a successful experience, their survey results showed that they were uncomfortable with the tool.  They also conveyed that not knowing how to use the technology left them unable to help their students during their projects.  I wonder how much more successful this entire study might have been if the four instructors had been well-versed in the tools they were requiring their students to use.

The final connection for me is the concept of completion.  Although not a requirement in one sense, the students were not allotted time to share their projects.  While that may not have been necessary for successful implementation in the authors' minds, I wonder if it might have made the project more appealing to the students, raised the scores of some of the pre-service teachers to some degree, and ultimately increased their desire to implement technology in their own classrooms down the road.  That, for me, is another lesson I will take into creating my own action research project.

Howley, A., Wood, L., & Hough, B. (2011). Rural elementary school teachers' technology integration. Journal of Research in Rural Education, 26, 1–13.

Impact of Mentoring on Student Retention

Salas, R., Aragon, A., Alandejani, J., & Timpson, W. M. (2014). Mentoring experiences and Latina/o university student persistence. Journal of Hispanic Higher Education, 1–14.

Summary:

In the article "Mentoring Experiences and Latina/o University Student Persistence" (Salas et al., 2014), the authors examined the experiences of Latina/o students who participated in a college mentoring program. The study was designed to look at the overall experiences of students who participated in the program and to evaluate to what extent the experience contributed to their academic success and persistence.

Participants:

Participants were chosen from a list of current or former mentors. Of the initial 30 students identified as possible candidates, 17 agreed to participate: 9 female and 8 male. Two of the 17 reported health and family issues and chose not to participate. Of the 15 remaining, 12 students were from in-state and 3 were from out-of-state. All participants were either currently serving or had previously served as mentors.

The study took place at a land-grant institution in a mountain west state. The institution’s minority make-up was as follows: ethnic minorities university-wide (13.6%), Latina/o (6.9%), Asian American (3.1%), African American (2.3%), and Native American (1.5%).

Testing:

Testing consisted of interviews, conducted in two rounds with each participant, with a follow-up interview 3 to 4 weeks later. The study explored the following questions:

  1. “What meanings did Latina/o students ascribe to their experience in the university mentoring program?”
  2. “How did these students experience their academic program at the university?”
  3. “What effect did participation in the mentoring program have on their persistence?”
  4. “Were there common experiences, stories told, and/or factors that these Latina/o students described as participants in the mentoring program?” (p. 4)

Analysis of the interviews was conducted using Interpretive Phenomenological Analysis (IPA), which explored the individual experiences of the participants and other factors that they identified as contributing to their success. More specifically, the study sought to determine to what extent Latina/o students were able to transition to college successfully, get involved in leadership opportunities, engage with academic and cultural activities and resources, and persist.

Results:

For the most part, participants consistently reported that their participation in the mentoring program helped them to be successful. Participants were better able to navigate the collegiate experience, increase their knowledge and appreciation of other cultures, improve their time management skills, build relationships, and learn about the various resources available at the college. Three main themes were identified as a result of the interviews: (1) common challenges (i.e., being a first-generation student), (2) culture shock, and (3) financial issues. Some other common themes included:

  • Lack of diversity at the university (47%)
  • Financial and time management issues (88%)
  • Feeling a lack of belonging (94%)
  • Out of state issues (18%)
  • Multicultural / biracial issues (18%)

Almost 41% of the participants indicated that the program provided them “with a sense of family and community, which encouraged them to do better” (p. 8). A very small percentage of students (6%) expressed that they felt college was easy.

Other common factors included:

  • Feeling overwhelmed as they transitioned to the college environment
  • Concerns regarding campus climate
  • Discrimination / perceived discrimination

One of the participants reported the following experience:

“My overall experiences in the mentoring program were very, very positive. It was great to establish relationships with like-minded people, people who had the same values, people who were often academically focused, people who were also involved on campus…it got to give me some positive role models to look up [to]…” (p.8)

Limitations / Recommendations:

  • Study participants were the mentors. Would the results have been any different had the participants not been the mentors? Were they successful because they were mentors, or were they mentors because they were successful?
  • Limited sample size of 15
  • Sample focused exclusively on Latina/o students
  • How might this research be applied to other populations (i.e., students with disabilities, other ethnic / racial groups)?
  • How might a mentor program benefit low-income students?
  • What were the mentors doing that was so effective?

Application to my own Action Research:

A couple of years ago, we created a program at the ASU Downtown campus in which staff, within Educational Outreach and Student Services, were each assigned a freshman floor at our residence hall, Taylor Place. The goal of the initiative was to develop a meaningful connection / relationship with each student as a way of fostering personal and academic growth, and helping students be successful by connecting them to critical academic support services and resources.

More recently, we have considered a more targeted approach with freshman who have challenges beyond just being first-time freshman. These challenges include being a first-generation, low-income, and/or student with a disability. We are also looking at students that enter the university with a low confidence interval (CI) score.

Over the past two semesters, we have seen some good results and have been able to build meaningful relationships with students that we believe will help them be successful and persist throughout their academic careers. Other than academic success (i.e., grades and whether or not students persist from one year to the next), we do not currently have an effective way of measuring whether our efforts are impacting students. More specifically, we do not have an effective way to measure which factors are most effective (i.e., 1:1 meetings, encouraging participation in activities and events, connecting students to resources and other services, time and financial management, etc.).

An area in which I feel our current approach is lacking, and which I shall explore through action research, is the viability of a freshman mentorship program at ASU. Over the past two semesters, we have seen some success and students are persisting, yet concerns about fully engaging students in a meaningful way remain.

Every student that comes into higher education is unique. Each brings their own values, identities, and academic foundation for learning, as well as their own limitations. Mentoring has been shown to be effective in bridging the gap. By exploring the viability and effectiveness of a mentoring program at ASU, we will be able to determine not only the general impact but, more specifically, which factors most effectively impact the students we will be focusing on.

References

Salas, R., Aragon, A., Alandejani, J., & Timpson, W.M. (2014). Mentoring Experiences and Latina/o University Student Persistence. Journal of Hispanic Higher Education, 1-14.

Belief and practice

Sandvik, J. M., van Daal, V. H., & Ader, H. J. (2013). Emergent literacy: Preschool teachers’ beliefs and practices. Journal of Early Childhood Literacy, 14(1), 28–52. doi:10.1177/1468798413478026

Summary

The point of this study was to get an idea of what preschool teachers’ beliefs about literacy were and whether those beliefs impacted their practice. The authors acknowledge that there is no question that the earlier literacy skills are fostered, the better, as research links strong emergent literacy skills in the preschool years to later success as a reader. Essentially, this study demonstrated that training teachers in a literacy development program more heavily impacts their beliefs about reading itself than the instructional practices that foster literacy development. Beliefs and practices of preschool teachers were examined through a survey. Before conducting the survey, the researchers identified instructional practices that foster emergent literacy skills; teachers were then asked whether they implemented these practices and, in a separate series of questions, what their opinions about emergent literacy were. From the survey data, the researchers determined that specific literacy trainings do impact beliefs about instructional techniques but do not correspond to changes in practice.

Contribution to the Field

The major contribution to the field of emergent literacy is that though training in emergent literacy programs may positively impact a teacher’s beliefs about certain instructional practices, the training has little to no effect on actual teacher practice.

Literature Review

In the review of literature, the authors discuss the disagreement within the early childhood community about the role of literacy. Some of this has to do with misunderstanding about what emergent literacy is. People are so fixated on the word literacy that they assume emergent literacy means students directly need to read and write. However, the authors define emergent literacy simply as the processes that foster the ability to read and write successfully later in life. Additionally, the role of literacy in classrooms is challenged by “deep seated beliefs” (Sandvik, van Daal & Ader, 2013, p. 30) held by teachers, including uncertainty about how literacy instruction should be carried out in preschool. Some preschool teachers do not believe that literacy skills should be promoted in preschool at all. Another challenge is that research suggests much ambivalence on the part of preschool teachers about their role in promoting literacy with their students.

Theoretical Framework/Lens

The theoretical framework was cohesive. First, the authors hypothesized, based on the current research, as to why there was disagreement within the preschool educator community about the role of emergent literacy. The authors acknowledged that there is little to no argument that success in early literacy translates into later reading success.

The lens that this study went through was that though there is understanding about the importance of early literacy, this does not necessarily imply that the instructional practices to support this will be in place.  This means that what teachers believe does not necessarily inform their practice.  In other words, a teacher can believe that it is important to promote early literacy but not have that reflected in their instruction.

Data Collection Methods

In order to find out more about teacher beliefs and practices, the researchers conducted a 130 item survey between two groups of preschool teachers: those who had participated in a literacy training program which promoted practices such as reading aloud and phonological awareness and a group of teachers who had not, the latter serving as a control group. The goal of the survey was to get a sense of teachers’ attitudes and beliefs about emergent literacy practices and what their actual literacy practices in their classrooms were.

Analysis

I thought the method used here was solid. A survey was given to teachers to measure their beliefs about literacy, what they did in practice, and whether those practices were in line with current research. The only concern I see with this approach is that many respondents might feel inclined to be dishonest about their practices, particularly the ones who went through the literacy training. I cannot help but think that if I went through training on literacy practices, I would be inclined to say that I use these practices because I am always seeking approval, wanting to be the star student, even though it would be made clear that the survey was anonymous. Granted, the survey showed that beliefs did not change practice, so it seems my theory would be wrong. However, I would love to see further study on the impact on ‘practice,’ or lack thereof, in order to get a firmer answer. This might mean that researchers would have to actually go into classrooms during literacy instruction to see whether the practices teachers claim or disclaim are in fact being followed.

Findings

Though preschool teachers had moderately positive beliefs about literacy in preschool, the authors of this study contend that beliefs do not correspond to practice. As Sandvik, van Daal, and Ader (2013) conclude, “with the exception of Shared Reading, preschool teachers reported engaging in all other literacy-related activities (Emerging Reading and Writing, Letter Knowledge, Phonological Awareness, and Literacy in Play), on average, only 0-5 minutes per day on any given literacy-related activity” (p. 46).

Discussion/Conclusions

There are several conclusions to come to from this study:

  • Students need to learn about reading in the emergent literacy phase in order to be prepared to read when they reach school
  • Exploration by the child and adult-directed activities work in conjunction with one another during the preschool years to foster later literacy
  • Children need to engage in storybook reading by interacting with the text through retelling, asking questions, reimagining the text. This makes reading fun.
  • Phonological awareness is as important as storybook reading and can be made fun through games
  • Identifying literacy issues can be done in the preschool years and is encouraged. Interventions will be more effective the earlier they are identified.
  • Further research is needed on how the identified literacy skills “can best be promoted in preschool” (Sandvik, van Daal, & Ader, 2013, p. 44).

In thinking about how some of our lowest income students might have access to preschool through Head Start, it is absolutely crucial that we are giving teachers the best programs and materials to teach our youngest students how to read.  Having teachers simply believe that literacy is important is not good enough to get our kids literate.  We must equip our teachers with the best resources and train them on how to use them in order to increase access to education through literacy.

Participatory Action Research: Uncovering the Ugly Truths

Critical analysis of “Participatory Action Research and City Youth: Methodological Insights from the Council of Youth Research”

 

The aim of this blog is to analyze and evaluate the article “Participatory Action Research and City Youth: Methodological Insights from the Council of Youth Research” (Bautista et al., 2013) as it relates to this week’s theme: communities of practice within action research and leadership. The article explores how inner-city youth of color get involved in the process of action research within their community of practice. The authors define this type of research as youth participatory action research (YPAR) (Bautista et al., 2013). From the beginning, I immediately connected with the scope of the study. As a product of a failing inner-city school, I found the research to be insightful, and I identified with many of the problems unearthed by the study. The inquiry proved to be of great value to the youth who participated in it, as they were forced to examine their “community of practice,” which comprised various high schools in and outside of their areas. At first the concept of social injustice in schools seemed quite ordinary; however, the study revealed a much deeper, uglier side to inequality in inner-city education. What I thought was unique was that, again, the youth themselves were uncovering this information by contrasting their high school environment with other schools.

The field notes taken by the researchers provided in-depth data and also generated new information for community members to share experiential knowledge. This rich information will allow community members to act as agents of change on issues that impact them directly (p. 4). As I became engrossed in the findings of the study, I began to reflect on the article “Unveiling the Promise of Community Cultural Wealth to Sustaining Latina/o Students’ College-Going Information Networks” (Liou et al., 2009). This article touched on some of the same inner-city school inequity issues that the Bautista article did. The concept I focused on in my analysis of both articles was impact. In the Bautista article (2013), the impact was powerful, especially for the youth council researchers as they saw the striking differences. At times it almost seemed unfair, like a slap in the face or a punch to the gut. Inequality screamed out as the researchers described their tour of Richside High School, with its planetarium, three cafeterias, and brand-new science and technology building (p. 10). Impact can also be seen and valued in the Liou article (2009), which examined the impact of socioeconomic factors leading to young people of color not having equal access to educational advancement, including the impact of a lack of caring school adults such as counselors (p. 549). According to the Liou article, studies show that students who have caring, supportive adults involved in their lives perform better academically (Liou et al., 2009).

Both articles take an in-depth look at the inequalities of inner-city schools, and for the purposes of this blog they shall be considered communities of practice. The Bautista article, however, struck a nostalgic and emotional chord with me. As the article described inner-city schools with dirty bathrooms, bars on windows, and overcrowded classrooms, I reflected upon my own high school environment: the stench of cigarettes and urine in the bathrooms, dirty cafeterias, and desks covered with gang-affiliated graffiti. As an inner-city youth, I didn’t even realize there was anything wrong with my learning environment. I thought all high schools looked and smelled like that, because I had never been to a nicer high school. In my opinion, that is what makes this study so impactful: the youth in the YPAR got to experience these disparities first-hand.

In conclusion, the examination of not only the Bautista article (2013) but the Liou article (2009) as well provided me with a shocking and disturbing view of the impact of the inequalities of inner-city schools. It was as shocking and impactful as the image of Arnold Schwarzenegger holding a knife large enough to carve a buffalo, with the caption “we’ve got to give every child in this state equal opportunities, equal education, equal learning materials, equal books, equal everything” (p. 16).

References

Bautista, M., Bertrand, M., Morrell, E., Scorza, D., & Matthews, C. (2013). Participatory Action Research and City Youth: Methodological Insights From the Council of Youth Research. Teachers College Record, 115(100303), 1-23.

Liou, D., Antrop-González, R., & Cooper, R. (2009). Unveiling the Promise of Community Cultural Wealth to Sustaining Latina/o Students’ College-Going Information Networks. Educational Studies, 45, 534-555.

Toward justice in the social & political act of research

What is constitutive of “evidence” or “research” in one setting may be representative of a highly bounded perspective and methodology.  Predominant approaches to research in the academy and for policy action largely reflect and reinforce status quo power dynamics.  Whole knowledge domains, ways of knowing, and knowledge producers are ignored or are represented from an “outsider’s” purview.  Critical race theory (e.g. Dunbar, 2008), critical indigenist pedagogy (e.g. Denzin & Lincoln, 2008, p. 1-20), participatory action research (e.g. Bautista, Bertrand, Morrell, Scorza, & Matthews, 2013), and a framework based upon community cultural wealth (Liou, Antrop-González, & Cooper, 2009) offer methodologies for “alternative” approaches to research.  These are distinct not only in what is explored, but in who defines the scope, leads the investigation, and shares findings (as well as how these agents generate the study scope, structure the investigation, and present and distribute study outcomes).  To strive broadly toward a more equitable (representative of individual stories, collective narratives, and languages that may reveal pertinent histories and angles) and accessible research program, strong arguments are made for the active engagement of underprivileged or nondominant groups in constructing research agendas and methods, and in generating and disseminating new knowledges, particularly as relevant to their own positioning.

 

The dominant paradigm in research is rooted in privileged Western, neoliberal ideological frameworks, which value the essentializable and universalizable.  Data collection is expected to be tidy, even “objective,” which contributes to the distance between the researcher, largely an “outsider,” and the researched, the “object” of study.  Expertise is similarly narrowly defined, even when researchers demonstrate the “best of intentions” in attempting to expose or better understand a problem or context of groups of the underprivileged, indigenous peoples, peoples of color, and/or peoples characterized by other forms of “difference.”  Methods and discourse are predicated upon the neoliberal imperial agenda, which values that which can be commodified and conceptualized in terms of the marketplace, competition, individualism, and exclusivity (illustrated by the “silo” metaphor in academia).  Even the “English language is positioned as an ideological commodity in a neo-liberal state – English fosters competition, reduces risk, provides insurance and produces entrepreneurial subjects” (Thomas, Risri Aletheiani, Carlson, & Ewbank, 2014, p. 243).

 

The active engagement of the marginalized or groups representing “difference” from the mainstream socio-cultural context in research, not only enriches the agenda and associated outcomes, the process can act to transform participants in ways that resist their own point of underprivilege or periphery.  This can have the vital effect of challenging the sociopolitical regime that enables the quietude of groups who face injustices – even from the perspective of the dominant culture’s own expectations of itself.  Learning (an integral aspect of research) may be perceived as a transformative, even radical act; Wenger (2000) provides a social definition of learning demonstrative of its impact beyond the edification of the individual: Learning “is an interplay between social competence and personal experience. It is a dynamic, two-way relationship between people and the social learning system in which they participate. It combines personal transformation with the evolution of social structures” (Wenger, 2000, p. 227).

 

Individuals motivated by an issue may form or become part of “communities of practice,” an opportunity for collaborative, critical exploration, wherein the participants are active agents of localized change and knowledge production.  Participants can develop a critical consciousness about their positionality and the various networks (particularly of informational capital [Liou, Antrop-González, & Cooper, 2009]) that may be available to support them and their agendas.  As “insiders,” researchers may have better access to their context, including to human subjects who may feel more trusting of, or less threatened by or curious about, the researcher (e.g. an observer or interviewer) (Bautista, Bertrand, Morrell, Scorza, & Matthews, 2013).

 

“What does it really matter?” one might say, if we get the data we need to make adequate decisions and to generally understand the problem context.  Students of the Council for Youth Research endeavored to “find out to what degree California students receive an ‘adequate’ education and whether it meets their academic needs,” a commitment of the state to its constituents.  The team “concluded that education for students in urban areas was inadequate” (Bautista, Bertrand, Morrell, Scorza, & Matthews, 2013, p. 8 & 11), presumably in part because of a disconnect (a “boundary” [Wenger, 2000]) in knowledge and action from the local level to the legislative level.  Equity of process and product (expression and dissemination) is significant not only for research’s sake, but because it is the products of research, which may have resulted from a practice undergone with the blinders or biases of a limited researcher perspective, that inform policy making (Thomas, Risri Aletheiani, Carlson, & Ewbank, 2014).  Pushing back against the dominant neoliberal norms governing research agendas and practices includes developing communities of practice with diverse stakeholders (e.g. students and school adults, academicians and indigenous shamans), and utilizing, even foregrounding, culturally relevant artifacts and practices such as storytelling and performance (Cajete, 2008; Dunbar, 2008), and presenting and sharing findings in languages and ways meaningful to all stakeholders, e.g. documentaries and multimedia presentations (Bautista, Bertrand, Morrell, Scorza, & Matthews, 2013, p. 8 & 11).  The impact of a more inclusive, representative, critical research program that centralizes points of difference (e.g. race, gender, class) may well be policies more reflective of the needs of nondominant groups.

 

 

Bautista, M. A., Bertrand, M., Morrell, E., Scorza, D., & Matthews, C. (2013). Participatory action research and city youth: Methodological insights from the Council of Youth Research. Teachers College Record. Retrieved May 30, 2014, from http://www.tcrecord.org.ezproxy1.lib.asu.edu/library/content.asp?contentid=17142

Cajete, G. (2008). Seven orientations for the development of indigenous science education. In N. K. Denzin, Y. S. Lincoln, & L. T. Smith (Eds.), Handbook of Critical and Indigenous Methodologies (pp. 487–496). Sage Publications.

Denzin, N. K., & Lincoln, Y. S. (2008). Introduction: Critical methodologies and indigenous inquiry. In N. K. Denzin, Y. S. Lincoln, & L. T. Smith (Eds.), Handbook of Critical and Indigenous Methodologies (pp. 1–20). Sage Publications. Retrieved from http://www.sagepub.com/booksProdDesc.nav?prodId=Book227933

Dunbar, C. J. (2008). Critical race theory and methodology. In N. K. Denzin, Y. S. Lincoln, & L. T. Smith (Eds.), Handbook of Critical and Indigenous Methodologies (pp. 85–99). Sage Publications. Retrieved from http://www.sagepub.com/booksProdDesc.nav?prodId=Book227933

Liou, D. D., Antrop-González, R., & Cooper, R. (2009). Unveiling the promise of community cultural wealth to sustaining Latina/o students’ college-going information networks. Educational Studies, 45(6), 534–555. doi:10.1080/00131940903311347

Thomas, M. H., Risri Aletheiani, D., Carlson, D. L., & Ewbank, A. D. (2014). “Keeping up the good fight”: the said and unsaid in Flores v. Arizona. Policy Futures in Education, 12(2), 242–261. doi:10.2304/pfie.2014.12.2.242

Wenger, E. (2000). Communities of practice and social learning systems. Organization, 7(2), 225–246. doi:10.1177/135050840072002

Inspired to take action in action research…

Last week, after reading an article (Shulman et al., 2006) that thoroughly described the differences between the PhD and the EdD, I felt affirmed in my choice of ASU’s EdD program. The underlying concept of participatory action research as a line of inquiry that arises out of the needs of the local community speaks to my personal and professional desires. I joined and continue to work in the field of education because I want to be a force of positive impact that helps those who live in the community I serve.

As we grappled with elements of scholar and community identities last week, I began to consider various aspects of research in general. Who is my community? Am I an insider? An outsider? Or some odd hybrid? Who ultimately is the beneficiary of the research? How do I ensure that they do benefit from the research? If I’m not a “part” of the community, can I even accurately identify what problems exist in the community?

I feel like these questions were not necessarily answered but rather deepened by some of this week’s readings. A few texts in particular grabbed my attention and caused me to critically reconsider the questions above.

The first article that really made me reflect on the previous questions was Participatory Action Research and City Youth: Methodological Insights from the Council of Youth Research (Bautista, Bertrand, Morrell, Scorza, & Matthews, 2013). This article followed a project of the Council of Youth Research in Los Angeles as its members taught high school youth how to do research and then supported them as they adapted some traditional tools and practices to fit their needs. The students conducted various branches of research around schooling in their local community, and many of the action research participants had personal connections to the findings, experiences, and systems it illuminated. I think I particularly connected to this article because these students seemed to benefit immediately from the process and findings of the research. The students walked away from the experience as more informed advocates for equitable educational opportunities in LA.

Another text that complicated the questions I’ve been grappling with comes from the book Handbook of critical and indigenous methodologies (Denzin, Lincoln & Smith, 2008). A major component of the text is to analyze the methodologies and practices of traditional research and eradicate practices that reinforce colonization. A means of doing that is to truly allow indigenous cultures and communities to create their own research agendas, identify their own problems, and conduct the research in ways that uphold their values and practices. This speaks to at least two of my main concerns. Research conducted in this manner truly benefits the community because it arises out of a need the community has established. It also addresses the question of whether an outsider can accurately identify a problem. This text has led me to believe that yes, an outsider might be able to identify elements of a problem that plagues a community, but they may never fully establish its importance, ramifications, or depth themselves. I think the text establishes ways in which “outsiders” can assist communities in research, but that role is described as wholly altruistic and at the mercy of the community.

One last text I was particularly drawn to was the article “‘Keeping Up the Good Fight’: the said and unsaid in Flores v. Arizona” (Thomas, Risri Aletheiani, Carlson, & Ewbank, 2014). This article was crafted and organized very well and took an interesting view and research stance on the Flores v. Arizona case and its implications for English Language Learner students in Arizona. One thing that immediately caught my attention, and was present throughout the article, was the very objective and distant feel of the text. The authors did a profound job of connecting novel concepts to the plight of ELL students and Arizonans, as well as crafting poignant images that help illustrate that plight even more. However, what I didn’t get from this article, and what I felt from some of the others, is a sense of personal connection. I understand that writing in a small group may drown out a strong individual voice, and the venue this text was written for may demand very removed, distanced writing, but I couldn’t help but feel that this article was written from the vantage of an outsider looking in.

That may very well not be the case, but upon reflecting on the idea of participatory action research and the role that we as community members have in serving the needs of that community, I can’t help but believe that my research should be something that I’m not only passionate about but personally connected to. I hope my writing reflects that element of community-member fervor and that it ultimately benefits my community.

Bautista, M. A., Bertrand, M., Morrell, E., Scorza, D., & Matthews, C. (2013). Participatory Action Research and City Youth: Methodological Insights from the Council of Youth Research. Teachers College Record, 115(10).

Denzin, N. K., Lincoln, Y. S., & Smith, L. T. (Eds.). (2008). Handbook of critical and indigenous methodologies. Sage.

Shulman, L. S., Golde, C. M., Bueschel, A. C., & Garabedian, K. J. (2006). Reclaiming education’s doctorates: A critique and a proposal. Educational Researcher, 35(3), 25-32.

Thomas, M. H., Risri Aletheiani, D., Carlson, D. L., & Ewbank, A. D. (2014). ‘Keeping Up the Good Fight’: the said and unsaid in Flores v. Arizona. Policy Futures in Education, 12(2), 242-261.

Critical Reflectivity and Student Agency

This blog article will focus on bridging the work of Bautista et al. (2013) and Liou et al. (2009) with Howard's (2003) rubric for self-reflection: beyond the ability to recognize your individual biases and agency, it is also important for research and researchers to recognize the power built from student experience and the wider community.

Howard (2003) described a very personal rubric to aid educators in reflecting inward: upon their current racial or cultural biases, as well as the major (personal) historical influences upon them. Bautista et al. (2013) expand upon this practice of cultural reflection but move the focus outward; using a youth participatory action research (YPAR) program as an example, the authors situate the power of student experience and student voice in educational research. The authors' goal was to explore which "traditional tools of research" (p. 2) students appropriated to evaluate their program, the Council of Youth Research. As part of a wider discussion, they also note the absence of student experience from educational research as a whole.

Liou et al. (2009), likewise, expand upon the theme of critical reflexivity by focusing upon the agency that exists outside of a traditional school. How do local communities empower students to succeed (or, in this case, to seek out relevant resources and materials to apply for college) in the absence of such assistance from an underperforming school? The authors note that often, when services do not exist in underperforming schools, or when those services are not readily available to all students, students instead look to their community. This creates an interesting paradigm for school improvement: focusing upon the resources a wider community provides to students (as well as their quality) gives a school a new understanding of the services students need, as well as "improv[ing] the quality of relationships between school adults and the students they serve" (p. 551).

These readings made me reflect upon one personal and very applicable example of the power of student agency, and how difficult it can be to build. During my time with the Arizona Department of Health Services (ADHS), I worked to build and sustain a coalition for youth anti-tobacco advocacy, made up of disparate school-based and community-based youth organizations from across the state. Historically, anti-tobacco work with youth in Arizona had focused heavily upon what we called, in shorthand, the "DARE model": in-classroom lectures featuring a figure of authority from the school or greater community who gave a very fact-based presentation. In focus groups with middle and high school students, however, we learned that this model was effective in passing along those facts (that cigarettes are deadly and addictive) but did not personalize the subject, nor give students a sense of involvement in the cause. The goal of this new advocacy-based coalition was to empower students to understand what policy is, how it affects them, and how they could affect it.

This was a radical change in the student-educator relationship, and one of the most difficult pieces to put into play was demonstrating to these student leaders that they had agency (within their homes, their schools, and their communities) and supporting them in developing their confidence. Many, at the outset, simply asked for a list of acceptable club activities, without giving much thought to their local environment or personal interests. Putting Howard's rubric into play, adult educators were a vital piece of building confidence among students to tackle issues of importance to themselves and their peers. These adults, who could be anything from a homeroom teacher to someone working in outreach at the county health office to a volunteer with a community youth program, approached "advocacy" and "student agency" in very different ways; we helped all parties, including ourselves, to reflect upon our own biases and our own communities in order to formulate a better way to speak to coalition student leaders. Likewise, as Bautista et al. suggest, we guided these students through the same process, asking them to identify their individual agency, as well as the agency of their local club, and to use that to find projects that were meaningful on both a personal and a community level. This conversation was essential; without the wealth of student voices and experience added to it, this coalition would never have risen past the lecture: a figure of authority telling the students what they should do.

Sources

Bautista, M. A., Bertrand, M., Morrell, E., Scorza, D., & Matthews, C. (2013). Participatory action research and city youth: Methodological insights from the Council of Youth Research. Teachers College Record, 115(10), 1-23.

Liou, D. D., Antrop-Gonzalez, R., & Cooper, R. (2009). Unveiling the promise of community cultural wealth to sustaining Latina/o students' college-going networks. Educational Studies, 45, 534-555.

Howard, T.C. (2003). Culturally relevant pedagogy: Ingredients for critical teacher reflections. Theory into Practice, 42(3), 195-202.

Learning from youth

What happens when marginalized and oppressed youth are invited to participate as equal partners in research about their educational experiences?

The Council of Youth Research, based in Los Angeles and highlighted in an article published in the Teachers College Record (2013), provides compelling testimony. This team of researchers, composed of students and faculty at both the high school and college level, is doing amazing and innovative research on educational inequity in their city.  The team's methods, insights, and reports serve as a model for the awesome potential of youth participatory action research.

The authors describe a comprehensive research plan that entails various forms of qualitative research in conjunction with quantitative data analysis.  The research team is also intentional about making sure all voices are included and represented equally.  For me, the most impressive aspect of the team’s work is the variety and creativity of ways in which the youth share their findings.  They use a multimedia approach that includes articles, PowerPoint presentations, video documentaries, and even rap.

I am inspired by the variety of methods the team uses to convey knowledge.  The multifaceted approach is intentionally inclusive, aimed so that a multitude of stakeholders can access the research findings.  It is accessibility at its finest.  This appeals to me both as someone who values inclusion and as a researcher committed to conducting research that benefits people.  Too often, research results are confined to arcane academic journals or conferences that only a privileged minority can access.  Sadly, social science research that actually could make a positive difference is not shared with people in positions of power who could implement interventions.  I am drawn to the idea of disseminating knowledge gleaned from action research in multiple ways so that more people can be exposed to and benefit from it.  Thus, one of the key lessons I learned from reading this article is that it is not only important to use a variety of methods to conduct research; it is paramount to use a variety of methods to distribute research findings.

After reading this article, I am excited about the possibilities offered by youth participatory action research and eager to try it in my own work.  My goal is to ultimately produce an innovative intervention to improve the retention, satisfaction, and success of ASU freshmen, and it seems that the most effective way to do this would be to partner with this population in designing a study to better understand their needs and experiences.  I am convinced after reading this article that involving individuals in all phases of research intended to benefit their community is the best way to achieve success.

My main concern is how I could possibly replicate the practices used by the Council of Youth Research.  The time commitment required of the university researchers and the high school students is enormous.  As noted by the authors of the article, Council members met for approximately 40 hours per week.  I wonder how the adults were able to get students to agree to such an extensive time commitment.  Even if I were able to figure out a way to get a group of ASU freshmen to agree to a similar schedule, it would be impossible for me to match it, given my status as a full-time employee.  I wonder what alternative arrangements might be feasible for arriving at results similar to those presented by the Council.

Reference:

Bautista, M., Bertrand, M., Morrell, E., Scorza, D., & Matthews, C. (2013). Participatory Action Research and City Youth: Methodological Insights From the Council of Youth Research. Teachers College Record, 115(10), 1-23.

Creating Culturally Relevant Communities of Practice

I have to say…I love Etienne Wenger's (2000) article, "Communities of Practice and Social Learning Systems."  Why, you ask?  Because I have realized how broken the communities of practice are in my community…not only among the administration, teachers, and other staff, but among our students and community members as well.

When I think about the various communities of practice that are visibly present, I come up with two very distinct ones: the tribal members (insiders) and the non-tribal members (outsiders).  I navigate between the two communities of practice on a regular basis.  As Wenger (2000) would call me, I am a "roamer" who has the ability to make connections with members of other communities of practice and bring them the knowledge of other communities.  I can relate to the outsiders because I'm an urban Indian, meaning I grew up in the city.  And I can relate to the insiders, as I am a tribal member.  Wenger (2000) talks extensively about the boundaries that communities of practice create, which are both positive and negative.  In this case, oftentimes, the boundaries established by the insiders create a great disconnect between themselves and others.  More often than not, the outsiders' personal experiences and competence regarding Native American culture, in my case Tohono O'odham culture, are so disconnected that fostering learning can be very difficult.  Boundaries are not meant to spotlight what you do not know, but the very idea of communities of practice requires it (Wenger, 2000).

Is it possible to create a community of practice that involves both the insiders and the outsiders?  I am pretty sure we could.  Of course, both the insiders and the outsiders would have to connect enterprise, mutuality, and repertoire with engagement, imagination, and alignment (Wenger, 2000).  This not only applies to the outsiders learning about the culture they serve (the insiders' culture); it will also require the insiders to understand the different cultures the outsiders bring to the Nation.  I, unfortunately, have only been looking at this from an insider perspective…the outsiders must learn about our students and our community.  I really did not see a value in it being the other way around.  Now that I have, I am intrigued by the idea of creating a community of practice that involves both sides, who truly have an interest in becoming one cohesive group with the same goal in mind…providing the best education possible.

These very boundaries, and the ability to access an educational community of practice, may very well be the cause of the lack of parental support.  Education itself has its own set of boundaries.  In his chapter titled "Seven Orientations for the Development of Indigenous Science Education" in Denzin, Lincoln, and Smith's (2008) Handbook of Critical and Indigenous Methodologies, Gregory Cajete wrote that "the sustained effort to 'educate' and assimilate American Indians as a way of dealing with the 'Indian problem' inevitably played a key role in how American Indians have historically responded to American 'schooling.'"  He later writes, "early missionary and government teachers naively assumed that American Indians had no education at all and that their mission was to remedy this 'great ignorance'" (Cajete, 2008).

Unfortunately, the assimilation process that many of our elders experienced in boarding schools has created a great dislike for the education system.  The way American schools operated was very different from the way the Native American education system operated.  Native American education was "characterized by observation, participation, assimilation, and experiential learning rather than by the low-context, formal instruction characteristic of Euro-American schooling" (Cajete, 2008).  Thus, many of our parents and grandparents (who may be legal guardians) do not care to participate in the communities of practice within the educational system.  They have no vested interest because of the disconnect between their personal experiences and their competence in the modern educational system.

Creating new communities of practice that do not initially focus on education may be a way to draw in community members who do not see education positively.  These individuals would have to connect with other community members in the same way as the insiders and outsiders mentioned above.  Communities of practice cannot make an impact if they do not have buy-in from all members.  As relationships continue to build and mutuality is strengthened by engagement, imagination, and alignment, this new community of practice can begin to shift its focus to educating our youth.  The community should include administrators, teachers, and staff (both tribal and non-tribal), parents/guardians, students, and community members.  Much of what people learn about what is going on in the community comes by word of mouth.  If we can create a strong community of practice, the word will get out, and we can then begin to expand it to reach and include more members.

Redefining communities of practice on our Nation will be critical to changing the mindsets of all administrators, teachers and staff members, as well as community members, in regards to the educational system present on the reservation.  In order for us to build a successful school system, all of us must meet in the middle to ensure that we are preparing our students for the best possible future.  And, who doesn’t want that?

References:

Cajete, G. (2008). Seven orientations for the development of indigenous science education. In N. K. Denzin, Y. S. Lincoln, & L. T. Smith (Eds.), Handbook of critical and indigenous methodologies (pp. 487-496). Thousand Oaks, CA: Sage Publications.

Wenger, E. (2000). Communities of practice and social learning systems. Organization, 7(2), 225-246.

Keeping Up the Good Fight: Reflections on Writing for a Highly-Specialized Audience

The website “Existential Comics” is one of my very favorites. It has a great nerdish sense of humor, and it gets at some of the more complicated informal components of why studying philosophy can be such a challenge (e.g. is it possible that Kant actually wanted someone to understand the Critique of Pure Reason? Why is symbolic logic such a pain in the neck? Why are there so few women represented in philosophy courses? What’s with all the beards?). In one of their best comics, a handful of philosophers are playing the game Pictionary and arguing the entire time. In one panel, Jeremy Bentham, one of the theorists responsible for the development of Utilitarian philosophy, leans over to Martin Heidegger and says, “Oh look at that, you *can* communicate in a way that is comprehensible.” It doesn’t have the same chortle-inducing hilarity when written out in a blog post, but what really caught me about this quip is that it reminded me how difficult it can be to take on a reading for which you may not have been part of the authors’ intended audience.

Huh? Heidegger is known to be a bit challenging to understand. Source: www.rugusavay.com

Take the article on Flores v. Arizona published by Thomas et al. (2014) out of Arizona State University in Policy Futures in Education, volume 12. The article is interesting for a number of reasons, but for the purposes of this entry, I’d like the readers here to think about the audience for the article; let me give a brief summary. The authors are writing about a landmark education case out of Arizona; the case (Flores v. Arizona) is centered on whether or not a public school is obliged, via U.S. civil rights law, to provide instruction that supports English Language Learners (ELLs). There is a brief description of the demography and language characteristics of the state of Arizona, a detailed legislative and legal history of the lawsuit/case, a comprehensive description of the data methods and motivations, an orientation to the scholarly research that applies to the case, and, as a bridge to the conclusion, a lengthy detailing of neo-liberalism, its challenges, its shortfalls, and its morphology in U.S. politics.

The sections on demography and language policy are clear. The growth of the Latino/a population in Arizona is substantial and is projected to rise, and many of the children in these families will be ELLs, so there is a clear need for more resources and planning to ensure that language acquisition instruction is provided to the populations that need it.  Similarly, the history of the Flores v. Arizona case is straightforward and is replete with references to literature that supports the history of the case and the media attention it has received. The section on methodology comes as a mildly jarring turn: a discussion of the technical details of how research was conducted on the media attention on the case. There is a lengthy section here intended to orient the reader to the research literature, too, with a multitude of references.

Here’s where the fun really begins, though, in the sections on neo-liberalism. The writing is at once descriptive of the general political-economic philosophy of neo-liberalism and also the way it is intertwined in the developmental arc of U.S. politics in the last 30-40 years. There are sections describing the central problems of the neo-liberal state, the academics who developed the theory, and the characteristic markers of a neo-liberal state in the way its legislation and cultural posture foster the creation of a specific set of values (rather than serving as a moderator of citizens’ civil liberties).

The conclusion comes down to making a connection between neo-liberalism and the education-legislative footprint of Arizona as viewed through the lens of the Flores v. Arizona case. One comes away from the article with a sense of what neo-liberalism is, what its flaws are, and how it’s connected to Arizona legislation. The Flores v. Arizona case is also used as a tool to demonstrate that neo-liberalism is a narrow window through which to view civil rights and education.

The connection between the cartoon, where Bentham gives Heidegger a hard time, and the article is about finding an audience. While reading the article, I found myself wrinkling my brow, wondering why the authors continued to pepper the writing with the word “discursive” and undefined phrases like “appropriate education.” I kept wondering when the authors would say something along the lines of “we argue,” or “based on our research findings, we contend,” or some other clear marker of a formal argument being made. I read the article a second time, more slowly, making notes along the way to ensure I hadn’t missed something while looking for the argument.

What I realized is that I am approaching the article, and all of the readings for that matter, from a source of bias. I have a mental model, a framework of understanding academic literature, that requires me to be able to pull out a contestable thesis, and after thinking carefully about the article, I am left with the following possibilities: (1) the argument in the article is that the description of the demography, legislation, case history, and neo-liberalism is valid, or (2) that there should be non-neo-liberal ways of viewing civil rights cases such as Flores v. Arizona.

Neither one of those arguments is contestable, to my mind, and so I imagine that I’m missing some critical reflexivity, some sense of the community of practice in higher education research, and some general understanding of the audience of the article. Heidegger isn’t terribly accessible, but if you can be professionally socialized into the field of Continental philosophy, his work on defining what it is to “be” is fascinating, and is critical to the development of the field of existentialism. Similarly, I am reminded that the literature in the field of higher education will be a new journey, and one that will require me to develop a new set of analytical frameworks to appreciate and understand the efforts therein.

As for finding an audience, this article is a reminder that as I write, considering my audience will be critical to my success; as I conclude, I wonder if I’ve been able to hit the mark in that regard.  The notion that “access” is a multifaceted concept is, I hope, one that doesn’t require much qualification.  But as I process this piece of writing, especially in the context of the literature I’ve read recently on communities of practice, part of the “access” conversation, I would contend, includes the concept of making your arguments accessible as a scholar to those who would both follow in your footsteps and those who would use your work to influence policy.

REFERENCES

Thomas, M. H., Risri Aletheiani, D., Carlson, D. L., & Ewbank, A. D. (2014). ‘Keeping Up the Good Fight’: the said and unsaid in Flores v. Arizona. Policy Futures in Education, 12(2), 242-261.

Who are you anyway?

Identity is personal and collective, and it influences one’s life at all levels.  Wenger (2000) suggests that our identity is shaped by participation in social learning systems – from families to work to school – and that it needs a strong foundation balanced with an ability to expand. Participating in social learning systems includes a sense of engagement with others, an ability to reflect on the system and consider alternatives, and a sense of purpose or alignment.  By knowing who we are, we are better able to imagine, investigate, respond, plan, and question.

College can be its own social learning system.  It certainly is a time for identity development (Chickering & Reisser, 1993).  Our readings this week seem aimed to drive home the point that all perspectives (especially those of the underrepresented) have value.  My goal as an educator in the community college system is to empower the students with whom I work to know themselves – to establish identity – and to learn how to create personally meaningful goals and opportunities.  To do this effectively, I need to be aware of myself as well.

Who am I?  I consider myself as one who serves others and who works toward social justice.  The Jesuit university I attended helped me to develop that identity which was solidified in a year of volunteer service after college graduation with an organization called the Jesuit Volunteer Corps.  The Jesuit Volunteer Corps gives young adults an opportunity to work toward social justice while living a simple lifestyle in community with others who serve – all with an openness to exploring spirituality.

Who are the students I serve?  They are people of all ages with diverse experiences.  Some desire to make someone proud or to pave the way for a younger sibling; most have hopes and dreams of participating in our consumer culture and/or of making a difference in their future work.  Arthur Chickering’s work on the identity development of college students suggests that college is a time to develop competence, interdependence, integrity, and purpose (Chickering & Reisser, 1993).  In order to successfully navigate the college system and one’s own development, some know-how is needed.  Unfortunately, students from poorer backgrounds are often denied that know-how.  They may, however, have other forms of capital, such as aspirational, social, or familial capital (Yosso, as cited in Liou et al., 2009).  Our job as educators is to increase our students’ capital to cross boundaries and achieve success so each can align with her/his unique goals.  I believe it’s my moral imperative as a person whose conscience was formed with exposure to social justice.

Consider higher education institutions as communities of practice.  Not all communities are equal, though.  Communities with a balance of engagement, mutual relationships, and a repertoire of artifacts (e.g., language, rituals) are more competent than others.  Without access to the artifacts or to relationships of mutuality, students will have a harder time succeeding.  If students can’t rely on their teachers to engage with them and share the resources necessary to succeed, then we all suffer the consequences of having a society divided into haves and have-nots.

Looking inward to better understand the self allows for more authentic engagement.  Sharing that understanding allows for more equal access.  As I model and encourage students to do the same we will all have access to a greater repertoire of resources and artifacts and improve self-efficacy as we move closer to our goals.  Who are you?

Chickering, A. W., & Reisser, L. (1993). Education and identity (2nd ed.). San Francisco, CA: Jossey-Bass.

Liou, D. D., Antrop-Gonzalez, R., & Cooper, R. (2009). Unveiling the promise of community cultural wealth to sustaining Latina/o students’ college-going information networks. Educational Studies, 45, 534-555.

Wenger, E. (2000). Communities of practice and social learning systems. Organization, 7(2), 225-246.