**Update:** This has been approved at the first stage. Once we write the paper, we will know whether it will be published.
Once again, any help would be greatly appreciated :)
We’re Up 17% - What Now?
Abstract:
From 2010 to 2017, we raised our developmental education success rates at Moraine Valley Community College by more than 17% while lowering our withdrawal rates by 30%. More importantly, we did so while simultaneously accelerating some of our programs and creating bridge courses, often skimming the stronger students off our upper-level courses. Initially, we used global data (e.g., our success rates, the success rates of our students at the next levels, our attendance rates, our grade patterns) to suggest policies and curricular alignment. At this point in the process, we are turning to more local data, focusing on individual performance and the appropriate resources we can provide our faculty and staff. This paper describes the strategies we developed to address our low success rates, as well as the next steps we plan to take in our continuing improvement process.
Description and Learning Outcomes:
Over the past seven years, we have raised our developmental education success rates at Moraine Valley Community College by more than 17% (i.e., the share of students earning grades of A, B, or C, as opposed to D, F, or I) and have lowered our withdrawal rates by 30%. It has been a systematic, “soft data-informed” process. We began by confronting our general retention data and reviewing our policies and procedures. For instance, to introduce the possibility of adopting a departmental attendance policy, we first surveyed our faculty, asking them to code all of their assigned F grades and to note which were due primarily to attendance. Our faculty reported that more than 70% of the F grades we had awarded were due to low attendance. This began the gradual incorporation of data into our analysis, planning, and evaluation processes.
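To make the grade-coding step concrete, here is a minimal sketch of how such a tally might be computed. The record structure, course names, and reason codes are hypothetical illustrations, not our actual survey instrument or data:

```python
from collections import Counter

# Hypothetical faculty-coded F-grade records; the field names and
# values are illustrative, not the college's actual survey data.
f_grades = [
    {"course": "MTH 095", "reason": "attendance"},
    {"course": "COM 085", "reason": "attendance"},
    {"course": "MTH 098", "reason": "coursework"},
    {"course": "RDG 041", "reason": "attendance"},
]

# Tally the coded reasons and compute the attendance-related share.
reasons = Counter(record["reason"] for record in f_grades)
attendance_share = reasons["attendance"] / len(f_grades)
print(f"F grades coded as attendance-related: {attendance_share:.0%}")
```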
As we began to work on our retention plan, we faced an impending migration from COMPASS to ACCUPLACER as our primary placement test. We capitalized on this challenge by analyzing all of our available data regarding student placement, matriculation rates, and future success. In doing so, we discovered large overlaps in curricula between academic levels, severe compression of grades in some courses, and the “over-success” of our “A and B” students moving to credited coursework. As a result, we have developed new metrics to track the efficacy of our placement instruments, how well we transition students from the lower levels of the curricula, and how well they persevere into and through their credited sequence.
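As a rough illustration of the kinds of metrics described here, the sketch below computes a cohort success rate (using the A/B/C definition above) and a transition rate into credited coursework. All cohort data, student IDs, and function names are hypothetical examples, not Moraine Valley's actual figures or tracking system:

```python
# Illustrative calculations for transition and perseverance metrics;
# the cohorts below are hypothetical.

def success_rate(grades):
    """Share of a cohort earning A, B, or C (success as defined above)."""
    return sum(1 for g in grades if g in {"A", "B", "C"}) / len(grades)

def transition_rate(completers, credit_enrollees):
    """Share of developmental completers who enter credited coursework."""
    return len(completers & credit_enrollees) / len(completers)

# Hypothetical cohorts keyed by student ID.
completers = {"s01", "s02", "s03", "s04", "s05"}
credit_enrollees = {"s01", "s02", "s04", "s09"}

print(f"Success rate:    {success_rate(['A', 'B', 'C', 'D', 'F']):.0%}")
print(f"Transition rate: {transition_rate(completers, credit_enrollees):.0%}")
```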
While evaluating our programming, we discovered that we suffered from some unusual side effects of our efforts to be innovative: we had developed so many interventions that we could no longer ensure that the right student entered the right intervention (RSRI: Right Student, Right Intervention). Subsequently, we began to work with advisors and other stakeholders on campus to help delineate our offerings and to ensure students would know whether they were the right fit for a particular type of course or intervention.
As a result of these analyses, we collaboratively designed an attendance policy, reevaluated our curricular transitions, worked to ensure students had their textbooks and required materials, and involved other critical stakeholders on campus in order to create a more consistent and visible pathway for our developmental students. Our efforts have also led, in small part, to the creation of enrollment and grade dashboards with our Institutional Research department, as well as an ongoing relationship that has led us to many other questions, challenges, and resources.
Finally, we are preparing to move from larger data sets to local indices that we can share with instructors at the classroom level. Doing so should provide our faculty with the appropriate feedback for course improvements, ensuring that we maintain and continue to raise our success rates.
Learning Outcomes – Participants will review their own transitional and longitudinal data processes; participants will learn about new retention and perseverance metrics.