DEVELOPMENTAL ALGEBRA ASSESSMENT: CLOSING THE LOOP UNDER SHIFTING EXIT STANDARDS
Alice Welt Cunningham and Kathleen Markert Doyle
ABSTRACT
This paper summarizes the Mathematics Department's efforts, despite changing University-wide exit standards, to assess and improve student learning in the College's Elementary Algebra course. As over three-quarters of Hostos' students enter requiring remediation in mathematics, improving student success in this gateway course is essential to increased retention and graduation rates.
Of our three assessments to date, the first was completed in the Fall 2011 semester using Hostos' previous Departmental learning outcomes. However, in 2012 CUNY issued new algebra standards, coupled with a University-wide exit examination. Item analyses for the new test are not available for individual institutions. Thus, the Department conducted its two subsequent assessments using its own midterm exams aligned to the new standards. While the midterm analyses show unusual improvement in factoring, the midterm exam is given directly after this topic is taught. By contrast, other weaknesses parallel the University-wide findings. The paper describes steps already taken and those in train to further improve student performance.
INTRODUCTION
Using student learning outcomes assessment to improve student performance has become the touchstone of successful institutional progress. In Beyond Crossroads, the American Mathematical Association of Two-Year Colleges prescribes a six-step assessment cycle requiring the continuing redefinition of student learning goals based on assessment results at the classroom, course, and program level (AMATYC, 2006, pp. 15, 29). Similarly, the Middle States Commission on Higher Education cites "a culture of 'continuous improvement'" as the criterion seminal to the determination of institutional effectiveness (Middle States, 2007, p. 17), with assessment of student learning at the heart of that process (Middle States, 2006, p. 63).
This paper describes the Hostos Mathematics Department's first three course-level assessments in Elementary Algebra (Math 020). Throughout the City University of New York, passing this course constitutes the prerequisite to credit-bearing college-level work and therefore to retention and graduation. As demonstrated by a recent five-year lookback, over three-quarters of Hostos' students enter needing remediation in mathematics (Hostos Self-Study Report, 2012, Appendix 9.1, p. 250), a situation that prevails as of this writing (Hostos OIR Student Profiles, 2014). Thus, improving performance in this gateway course is integral to improving retention and graduation rates overall. This paper summarizes the Mathematics Department's ongoing attempts to analyze student performance since Fall 2010 notwithstanding the University's recent revision of its exit-from-remediation algebra standards (CUNY Mathematics Panel Recommendations, 2012).
The University's new exit standards were issued early in 2012, effective immediately, and have been modified a number of times since then. As can be seen from a comparison of the old and new course learning outcomes (attached as Appendices A and B, respectively), the new standards introduce the topics of inequalities and function notation and place a heavier emphasis on two-step factoring and scientific notation problems. Proportions and percents, formerly not addressed at this level, are now addressed in the context of word problems.
Pursuant to University mandate, student learning is now measured by a University-wide final exam (the CUNY Elementary Algebra Final Exam, or CEAFE). However, individual item analyses for the new final exam are unavailable on a disaggregated college-by-college basis (CUNY Office of Institutional Research, 2013). Therefore, while the Mathematics Department's first assessment (completed in Fall 2011) was based on the Fall 2010 Departmental Final Exam and related learning outcomes, the two subsequent assessments (in Spring 2012 and Spring 2013) were based on the Departmental Midterm Exam, for which the Department is able to prepare the type of question-by-question analyses not yet available for the new University-wide final. Because the Departmentally-prepared exam is a midterm rather than a final, it is administered by the tenth week of the 14-week semester. The midterm thus covers only the first 10 of the 14 learning outcomes based on the new standards (see Appendix B).
In order to permit a comparison with the results of the first assessment, the second assessment was aligned to the earlier learning outcomes. The Spring 2013 assessment described here therefore represents the first to use learning outcomes based on CUNY's new University-wide standards. Despite the changing exit standards, many of the exam questions remain the same, thus permitting comparisons and conclusions.
THE DATA
Our initial assessment was performed by hand by Hostos' Office of Institutional Research based on a representative sample of Departmental pencil-and-paper exams. By contrast, both the second and third assessments, based in each case on the multiple-choice Departmental Midterm, were graded by Scantron, with all exams taken into account. Nevertheless, comparing similar questions from each of the exams permits performance comparisons.
The data summarized below report the results of the most recent assessment, with comparisons to the two earlier analyses where possible. The results reflect Scantron item analyses for 605 students on four forms of the exam, with the forms distributed approximately equally among the examinees. In keeping with the first two assessments, the following Department-wide assessment standard was used to determine whether a learning outcome was met:
60% or above correct: S+ Above Satisfactory
50-59% correct: S Satisfactory
40-49% correct: N Needs Improvement
Below 40%: U Unsatisfactory
Revision of these standards to reflect the new University-prescribed 60% passing cut point currently is under way.
The data are summarized in several tables. The first table reports results by individual learning outcome, listing all exam questions applicable to each such learning outcome. For learning outcomes involving more than one question, the remaining five tables break down those results on a question-by-question basis, as follows: (a) linear equation application problems (SLO #4; Table 2); (b) literal equations (SLO #5; Table 3); (c) exponential expressions, including scientific notation (SLO #7; Table 4); (d) operations with polynomials (SLO #8; Table 5); and (e) factoring (SLO #9; Table 6).
Table 1: Analysis of student performance by learning outcome
Table 2: Linear equation application problems (SLO #4)
Question types assessed: proportion (1st analogous part missing); proportion (2nd analogous part missing); percent increase/decrease (find new amount, then percent change); percent increase/decrease (find percent, then new amount); translate word problem into equation.
Table 3: Literal equations (SLO #5)
Question type | % correct | Rating
---|---|---
Multiplication/division principle of equality | 52 | S
Addition/subtraction principle of equality | 65 | S+
Table 4: Simplifying exponential expression (SLO #7)
Question types assessed: division with 2-step scientific notation; exponential fraction with negative exponents.
Table 5: Operations with polynomials (SLO #8)
Question types assessed: subtracting one 3-term polynomial from another (distributive property); division of a 3-term polynomial by a monomial.
Table 6: Factoring polynomials (SLO #9)
ANALYSIS
As the three assessments involved three different exams and two different sets of learning outcomes, no precise comparison is possible. Moreover, University-wide data, which are not disaggregated on a question-by-question basis for individual colleges, were available as of this writing only for the first University-wide final exam in Fall 2012. Nevertheless, question-by-question analyses do permit some conclusions. The following discussion highlights learning outcomes where student performance increased; those where student performance decreased; and those for which, because of the University's new exit standards, no comparative data are yet available.
INCREASED PERFORMANCE
Factoring. While the Fall 2011 final exam assessment showed pronounced student weakness in factoring polynomials (28.4% correct response rate), both the Spring 2012 and Spring 2013 Departmental Midterms show dramatic improvement in that area (67% and 69% correct, respectively). However, as the Midterm is administered directly after this topic is taught, it may be that the improved student performance reflects short-term procedural absorption rather than long-term conceptual understanding (e.g., National Research Council, 2001). This interpretation is supported by the University-wide results, which reflect a success rate in this area of less than 60% (CUNY Office of Institutional Research, 2013). Thus, the increased factoring results on the Hostos Midterm, while impressive, are suggestive rather than dispositive of student improvement. Because college-by-college performance results on individual CEAFE questions are unavailable, analysis of our students' performance in this area at semester's end is not yet possible.
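To illustrate the kind of two-step factoring emphasized by the new standards (a representative example, not an item reproduced from the Midterm or the CEAFE), students must first extract a common factor and then factor the remaining expression:
$$3x^{2} - 12 = 3(x^{2} - 4) = 3(x + 2)(x - 2).$$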
Inequalities. Hostos' pre-existing rubrics did not address this learning outcome, thus precluding a comparison of previous results on this topic. However, the 2013 Departmental Midterm shows student improvement over the 2012 Midterm from 38% to 45% (SLO #6, Table 1 above). The current result is in line with CUNY-wide performance on this rubric on the Fall 2012 CEAFE of 47%. This increase represents an improvement of over 18% (a 7-percentage-point increase calculated as a percent of the earlier 38% performance). This result is particularly impressive in that the 2013 Midterm, unlike its 2012 predecessor, requires not only solving the inequality but also indicating the correct answer by selecting the appropriate graph, thus making the question more difficult. Accordingly, while student performance on this new rubric remains low, the improvement is nonetheless worth noting.
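As an illustration of the added graphing step (again a representative example rather than an actual exam item), a question of this type might ask students to solve
$$-2x + 5 > 1 \;\Longrightarrow\; -2x > -4 \;\Longrightarrow\; x < 2,$$
and then to select the number-line graph showing an open circle at 2 with shading to the left, a format that also tests whether students remember to reverse the inequality when dividing by a negative number.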
DECREASED PERFORMANCE
By contrast, student performance on questions involving operations with scientific notation fell from 42% on the 2010 Departmental Final under the pre-existing rubrics and 51% on the 2012 Midterm to 23% on the 2013 Midterm. The Fall 2010 question was not analogous, while the 2012 problem represented only a one-step rather than a two-step calculation. Thus, this decline in performance may be attributable to the increased degree of difficulty introduced for this learning outcome by the new standards, which require a two-step analysis for a result written in scientific notation: the initial quotient must then be rewritten so that its coefficient lies between 1 and 10. For purposes of comparison, CUNY-wide performance on this learning outcome averaged only 27.8%, the least successful result of the 25 questions on that exam. Thus, Hostos' 23% performance rate, while needing improvement, is not out of line with the CUNY-wide results.
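A worked example of the two-step process (the numbers here are illustrative, not those used on the exam):
$$\frac{3.0 \times 10^{2}}{6.0 \times 10^{5}} = 0.5 \times 10^{-3} = 5 \times 10^{-4}.$$
The first step performs the division; the second rewrites the result so that the coefficient lies between 1 and 10.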
OUTCOMES NOT PERMITTING CURRENT COMPARISON
Two areas not addressed in the two previous assessments are word problems involving proportions and percent increase and decrease (SLO #4, Table 2).
Proportion. As this learning outcome was introduced by the 2012 CUNY-wide standards, it was not addressed by either of our previous assessments. Thus, no comparative data are available. Current student performance (at 54% and 60% on questions 4 and 9, respectively) meets the pre-CEAFE Departmentally-established criterion for satisfactory performance (50-59%). Subsequent assessments under the new exit standards should present a clearer picture.
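A representative proportion word problem of this type (not an actual exam item) might read: if 3 notebooks cost $4.50, what do 7 notebooks cost? Setting up
$$\frac{3}{4.50} = \frac{7}{x} \;\Longrightarrow\; 3x = 31.50 \;\Longrightarrow\; x = 10.50,$$
so 7 notebooks cost $10.50.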
Percent Increase and Decrease. The same is true of the new two-step percent increase and decrease problems, which require finding either a new amount and then the percent change, or finding the percent change and then the new amount (questions 10 and 12, respectively). Again, while student performance, at 55% and 57%, fell within the Departmentally-determined satisfactory range (50-59%), further monitoring is required.
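A representative two-step problem of the first kind (illustrative numbers, not an actual exam item): an item priced at $80 is marked up by $12. The new price and the percent increase are
$$80 + 12 = 92, \qquad \frac{12}{80} = 0.15 = 15\%.$$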
DEPARTMENTAL ACTIONS TO FOSTER IMPROVEMENT
Following CUNY's issuance of the new rubrics in January 2012, the Mathematics Department took the following additional steps in order to improve student performance:
Updating the course syllabus and day-by-day teaching guide to keep all 30+ sections on track.
Continuing revision of the Departmental Midterm to focus on the types of problems found on the CEAFE.
Preparing new Departmental worksheets in both English and Spanish to reflect the emphases of the new rubrics.
Preparing a new workbook for classroom use.
Introducing and assessing supplemental instruction (group learning sessions led by an advanced student "peer leader"), from the Fall 2012 semester onward.
Working toward a mandatory "multiple repeaters" section with additional support systems.
SUMMARY AND CONCLUSIONS
So far, the Department's most recent assessment shows the best student performance in factoring, at 69%, and the worst performance in scientific notation, at 23%. The latter result, which mirrors the CUNY-wide 28% correct response rate, apparently reflects the two-step aspect of the calculation. Finally, while student performance on the two new applications of linear equations (proportions and two-step percent increase and decrease problems) qualifies as satisfactory, analyses of student progress in these areas must await additional assessments. Until the University makes CEAFE item results available on a college-by-college basis, the Department plans to continue assessing student performance using the Departmental Midterm.
REFERENCES
American Mathematical Association of Two-Year Colleges (2006). Blair, R. (Ed.), Beyond crossroads: Implementing mathematics standards in the first two years of college. Memphis: Southwest Tennessee Community College (amatyc@amatyc.org).
Cunningham, A.W. & Doyle, K. (2012). Spring 2012 Math 020 Midterm Assessment. NY: Hostos Community College.
City University of New York Mathematics Panel Recommendations (January 2012). Elementary Algebra Proficiency Exam Topics, Fall 2011. NY: Author (generally circulated by email correspondence of January 18, 2012, and revised by email correspondence of April 23, 2012, with sample final examinations circulated August 21, 2012).
City University of New York Office of Institutional Research (2013). CUNY Elementary Algebra Final Exam: Fall 2012 Item Results and Learning Outcomes Correspondence. NY: Author.
Hostos Community College (2012, February). Institutional Self-Study Report. NY: Author.
Hostos Community College Mathematics Department (June 2012). Academic Program Review. NY: Author.
Hostos Office of Institutional Research (2011). Math 020 Fall 2010 Analysis of SLO's on Final Exam (revised 12/1/11). NY: Author.
Hostos Office of Institutional Research (2014, April). Student Profiles from Fall 2002-Spring 2014. NY: Author. Retrieved April 12, 2014, from http://www.hostos.cuny.edu/oaa/oir/PublicDocuments/StudentProfile.pdf.
Middle States Commission on Higher Education (2007). Self Study: Creating a Useful Process and Report, 2nd ed. Philadelphia: Author.
Middle States Commission on Higher Education (2006). Characteristics of Excellence in Higher Education. Philadelphia: Author.
National Mathematics Advisory Panel (2008). Foundations for Success: The Final Report of the National Mathematics Advisory Panel. Washington, DC: U.S. Department of Education.
National Research Council (2001). J. Kilpatrick, J. Swafford, & B. Findell (Eds.), Adding it up: Helping children learn mathematics, Chap. 4. Mathematics Learning Study Committee, Center for Education, Division of Behavioral and Social Sciences and Education. Washington, D.C.: National Academy Press.