Table of Contents
  1. Fostering Quantitative Reasoning in Introductory Psychology through Asynchronous Assignments Featuring Low-Stakes Quizzes, Data Analysis, and Visualization Activities
    1. Introduction
    2. The Present Study
    3. Method
      1. Course Section and Student Characteristics
      2. Online Asynchronous Assignments
        1. Noba textbook quizzes
        2. Qualtrics homework assignments
        3. Excel worksheet activities and content acquisition podcasts
        4. Embedded vocabulary quizzes
      3. Additional Measures
        1. Demographic questionnaire
        2. Statistical knowledge test
        3. Statistics anxiety scale
        4. Self-efficacy for online learning
        5. Device used
        6. Software installation and access
    4. Results
      1. Preliminary Analysis
      2. Noba Textbook Quiz Scores
      3. Embedded Vocabulary Quizzes
      4. Qualtrics Homework Assignments
      5. Excel Worksheets
    5. Discussion
    6. References
    7. Acknowledgements

Fostering Quantitative Reasoning in Introductory Psychology through Asynchronous Assignments Featuring Low-Stakes Quizzes, Data Analysis, and Visualization Activities

Nicolas Zapparrata, The Graduate Center, CUNY

C. Donnan Gravelle, The Graduate Center, CUNY

Elizabeth S. Che, The Graduate Center, CUNY

Arshia K. Lodhi, The Graduate Center, CUNY

Peter J. Johnson, The Graduate Center, CUNY

Raoul A. Roberts, The Graduate Center, CUNY

Riya Anjaria, The Graduate Center, CUNY

Patricia J. Brooks, The Graduate Center and The College of Staten Island, CUNY

Abstract

The ability to evaluate and interpret data is a critical skill in psychological science. In this paper, we report on a department-wide effort to promote quantitative reasoning, data analysis and visualization, and interpretation of research findings in Introductory Psychology through low-stakes online asynchronous assignments, and we examine how these assignments supported student engagement throughout the semester. Undergraduates (N = 1523) enrolled in 25 course sections at a nonselective minority-serving public college were assigned weekly quizzes on textbook modules and homework assignments encompassing statistical reasoning problems, scientific abstracts, instruction on performing database searches, and TED talks, with embedded multiple-choice quiz questions assessing comprehension of statistical and general psychology vocabulary. Six of the nine homework assignments included content acquisition podcasts (CAPs) (Kennedy et al. 2016) demonstrating how to manipulate psychology-relevant data in Excel, with step-by-step instructions for calculating basic statistics, organizing data into tables, and generating graphs. Most (82%) of the students who submitted the first and at least one other assignment (N = 1061) completed the course with grades of C or higher. For students who completed the course, prior statistical knowledge was positively associated with all measured outcomes of student engagement and learning (number of homework and Excel assignments submitted, and both textbook quiz and vocabulary knowledge scores); self-efficacy was positively associated with textbook quiz scores and Excel submissions. Scores on the embedded vocabulary quizzes were positively associated with textbook quiz scores and with submissions of both homework and Excel assignments. Notably, anxiety about statistics had no relation to learning outcomes. Taken together, the findings indicate the feasibility of incorporating quantitative reasoning and data science in Introductory Psychology. Further efforts are needed to reach students who fail to engage with the curriculum.

Keywords: quantitative reasoning, formative assessment, student engagement, active learning.

Introduction

The foundation of modern psychology is the scientific study of behavior and mind. As such, the ability to use statistics to interpret data and evaluate claims is a fundamental skill for undergraduate psychology students to master (Lutsky 2006). The American Psychological Association (APA) Guidelines for the Undergraduate Psychology Major emphasize statistical literacy as a necessary component of psychology coursework at all levels (APA 2023). To meet the guidelines, instructors need to create lessons that help students understand and communicate research findings, calculate basic statistics, manipulate and visualize datasets, and evaluate information presented in varied formats, including graphs and tables. Creating opportunities for students to engage in quantitative reasoning may help students distinguish evidence-based claims from subjective opinions and pseudoscience (Johannssen et al. 2021; Lilienfeld et al. 2009). This is of considerable importance given the prevalence of misleading and deliberately false information shared via the internet and other media (Ridgway et al. 2017). Quantitative reasoning is also valuable in shaping students’ career paths. According to the National Association of Colleges and Employers (NACE 2021), critical thinking abilities, such as solving problems and accurately summarizing and interpreting data, are essential skills for college students entering the job market. Also listed among the NACE competencies is the ability to use technology (e.g. software, computers) effectively to complete specific tasks and improve efficiency and productivity.

Research on teaching science, technology, engineering, and mathematics (STEM) subjects, including psychology, indicates that learning outcomes improve when students are engaged in “hands-on” active learning as opposed to traditional lecture-based instruction (Freeman et al. 2014). Active learning provides students with opportunities to engage in authentic and meaningful problem solving and discovery; it is most effective when learning is scaffolded through demonstrations, hands-on step-by-step tutorials, and feedback (Alfieri et al. 2011). In prior work, screencast video tutorials (narrated screen recordings) were used in upper-level psychology statistics courses to teach students how to analyze data using Statistical Product and Service Solutions (SPSS) software (Breneiser et al. 2018; Lloyd and Robertson 2012). Results indicated that the recorded SPSS tutorials were effective in supporting students’ grasp of statistics across both in-person and online course formats. Despite these positive findings, SPSS is expensive and may be less accessible to students than other software tools, such as Excel. Further, this prior work focused on upper-level courses, whereas efforts to teach statistical literacy should also target introductory courses so that students gain access to practical skills early in their academic careers. Therefore, we sought to expand this approach by implementing similar activities in an introductory-level psychology course, while also increasing access to relevant technology by using Excel. Introductory Psychology (also known as PSY100) is a general education course taken by approximately 1.6 million students annually in the United States (Gurung et al. 2016), including many who do not pursue additional coursework in psychology. This course is thus an ideal setting for developing quantitative reasoning skills not just for psychology majors, but for students pursuing a broad range of majors and future careers.

Brooks et al. (2024) provided an initial report on our department-level efforts to promote quantitative reasoning, interpretation of research findings, data analysis, and visualization skills in PSY100. Students enrolled at a non-selective minority-serving public college were introduced to statistical literacy concepts through a series of homework assignments created with Qualtrics survey software. In a subset of the assignments, students were shown how to engage with psychology-relevant data using Microsoft Excel software. The instructional team used Excel rather than SPSS or R because students have institutional access to this application and Excel promotes a broader application of skills beyond the classroom. Software used to introduce statistical literacy should be both affordable and user-friendly (Adams et al. 2013). Robust statistics software applications like SPSS are user-friendly but prohibitively expensive, and while R is a free open-source alternative, it has a steep learning curve for developing coding skills, rendering it inappropriate for PSY100. Excel strikes a balance, as it is both fairly inexpensive and relatively easy to learn (Ozgur et al. 2015). In the present implementation, Microsoft Office 365 (including Excel) was made available to all students at no cost through a university subscription. We introduced Excel as a tool for data manipulation, analysis, and visualization through a series of instructor-guided screencast videos, also known as content acquisition podcasts (CAPs) (Kennedy et al. 2016), that demonstrated use of Excel with psychology-related datasets. Students were also given instructions on how to download Microsoft 365 at no charge by accessing the university subscription.

In Brooks et al. (2024), PSY100 students’ engagement with statistical literacy lessons was positively associated with their statistics knowledge, statistics anxiety, and self-efficacy for online learning, all measured at the start of the semester. Statistics knowledge was assessed using items adapted from publicly available AP Statistics practice tests and the New York Times “What's Going On in This Graph?” column (New York Times in partnership with the American Statistical Association, The Learning Network 2021), and statistics anxiety was assessed with items adapted from the Statistics Anxiety Rating Scale (STARS) (Teman 2013). While statistics knowledge predicted completion of the Excel worksheets, statistics anxiety and self-efficacy predicted homework completion in general. That is, students with greater statistics anxiety and higher self-efficacy for online learning submitted more of the Qualtrics assignments than their classmates who were less anxious and/or had lower self-efficacy. The relevance of these factors has been noted elsewhere in the literature (Carpenter and Kirk 2017; Foley et al. 2017). Faculty in the U.S. often express concerns about the mathematical readiness of first-year college students and the need for remediation (Er 2018). Such concerns have been amplified since the COVID-19 pandemic, with widespread reports of learning losses in mathematics as well as reading (Toness and Lurye 2022). Students enter college with varied levels of statistical knowledge, with correct understandings coexisting alongside misconceptions and gaps in knowledge (Cook and Fukawa-Connelly 2016). Such variation in statistical knowledge may play a role in students’ willingness to attempt statistics-related assignments and in their performance. Zeidner (1991) defines statistics anxiety as a type of performance anxiety characterized by physiological arousal and negative emotions (e.g. worry) when presented with statistical content. Statistics anxiety, and math anxiety more generally, may exert negative influences on students’ academic achievement and motivation (Chang and Beilock 2016; Pekrun et al. 2017), though not all studies have found this link (Brooks et al. 2024; Trassi et al. 2022). Similarly, several studies have found positive impacts of self-efficacy on course outcomes and student learning in general (Alqurashi 2016; Gurung and Stone 2023; Tsai et al. 2011), though not all studies have found this relation (Gravelle et al. 2024). Given the mixed findings, the present study examined links between prior knowledge of statistical concepts, statistics anxiety, self-efficacy, and learning outcomes.

Besides internal factors such as statistical knowledge, statistics anxiety, and self-efficacy, other barriers to student learning include technological difficulties and limited access to computers for schoolwork (El Mansour and Mupinga 2007; Gravelle et al. 2024). In prior research (Powers et al. 2016), PSY100 students reported challenges in accessing and using instructional software in a hybrid course with an online mode of instruction, despite instructors dedicating considerable class time to helping students set up accounts and troubleshoot difficulties. Such findings suggest that college students may lack familiarity or savvy with software installation and licensing, and may become frustrated when difficulties arise. In another study conducted at our institution (Gravelle et al. 2024), approximately 9% of PSY100 students reported using hand-held devices (i.e., smartphones or tablets) to complete online coursework rather than desktop or laptop computers. In that study, use of hand-held devices for schoolwork was associated with lower rates of passing the course, fewer assignments submitted, and lower test scores. Similarly, Brooks et al. (2024) found that students’ self-reported use of hand-held devices to complete assignments was associated with a lower rate of submitting the Excel worksheets.

Motivating students to engage with demanding coursework is an ongoing concern in higher education, especially at non-selective colleges serving diverse populations. While such institutions provide broad access to higher education, they are subject to high rates of course attrition, chronic absenteeism, low graduation rates, and declining enrollments (National Student Clearinghouse Research Center 2022, 2024), likely related to multiple social and economic factors (Porchea et al. 2010; Zengilowski et al 2023). Many high school graduates question the value of a college education given its high cost (Belkin 2021; Tough 2023) and they may continue to feel ambivalence even if they choose to enroll. While these issues predate the COVID-19 pandemic, the trends appear to have worsened in recent years (Bulman and Fairlie 2022). To address the high attrition and declining enrollments at community colleges and other non-selective institutions, academic faculty and departments need to work together to develop flexible curricula that support learning in ways that are relevant to students’ career goals and broader interests. Students benefit from academic supports that promote engagement through skills development, feedback, and use of technology (Collins et al. 2022). More generally, instructors can support students by helping them develop transferable skills such as statistical literacy in PSY100 to support critical thinking and technological skills that extend beyond the field of psychology and are useful for the workplace and everyday life. Instructors can also support students by adopting free Open Educational Resources (OERs) and institutionally accessed technology to reduce financial barriers that affect many students enrolled at non-selective colleges, open-admissions institutions, etc. (Colvard et al. 2018).

The Present Study

The present study focused on supporting statistical literacy of PSY100 students at a non-selective minority-serving public college, with the aim of addressing a noted limitation of our prior research. A major limitation of Brooks et al. (2024) was that it focused exclusively on work submitted, without any objective measures of students’ understanding of the psychological and statistical concepts embedded in the lessons. Hence, the present study examined students’ comprehension of course materials via intermittent quizzes to gain further insight into students’ grasp of course content as they progressed through the curriculum. One set of quizzes focused on content taken from Noba Introductory Psychology modules, an open-access textbook available free of charge to students (Diener Education Foundation n.d.). A second set of quizzes provided a formative assessment of students’ comprehension of statistical and general psychology vocabulary terms embedded in the online homework assignments. Taken together, the two sets of quizzes allowed us to examine students’ comprehension of course material and ascertain whether the curriculum and instructional materials were appropriate for general education students. In an effort to identify potential barriers to student success and learning outcomes, we examined students’ prior statistical knowledge, statistics anxiety, self-efficacy for online learning, software installation and access, and devices used as predictors of learning outcomes. Based on prior literature, we hypothesized that each of these factors would be related to students’ quiz scores and rates of submission of assignments.

Method

Course Section and Student Characteristics

Course outcomes assessment data were collected from undergraduates enrolled in 25 course sections of PSY100 taught in Fall 2023 and Spring 2024 at a non-selective (open admissions) minority-serving public college in the Northeastern United States. Course sections were taught by different instructors following a uniform syllabus with instructional materials posted to a learning management system. The research protocol was classified as exempt by the university’s Institutional Review Board. Following best practices for open science, the following instructional materials are publicly available in an Open Science Framework (OSF) repository (Zapparrata et al. 2025): course syllabus, Noba quizzes, online Qualtrics assignments, and Excel worksheet activities. Additionally, the OSF repository includes a de-identified datafile and the R analysis script.

The course sections comprised 20 regular sections with < 50 students and 5 large sections with > 135 students. Eighteen sections met in person (15 regular, 3 large enrollment) and 7 sections met synchronously online via Zoom (5 regular, 2 large enrollment). The analytic sample (N = 1061) consisted of students who submitted the first homework assignment (HW1) and at least one other assignment, thus demonstrating a minimum level of engagement with the asynchronous online components of the course; see Table 1 for a breakdown of demographic characteristics as self-reported by students in HW1. The analytic sample comprised 69.7% of the enrolled students (N = 1523), i.e., roughly 30% of students did not turn in HW1 and at least one other assignment. Of the 1061 students in the analytic sample, 866 (81.6%) completed the course with a passing grade (C or higher), and 195 (18.4%) received D/F grades or withdrew from the course (DFW). Of the 462 students who were excluded from the analytic sample, 440 (95.2%) did not turn in HW1 and 22 (4.8%) turned in HW1 but no other homework assignment. Most of these students (77.3%, n = 357) received DFW grades, while 105 students (22.7%) passed the course. No further information about these students is available.

All courses followed a uniform syllabus and used the same instructional materials. In addition to the asynchronous online assignments that are the focus of this report, students completed three in-class projects as group work, either in person or on Zoom for the online synchronous course sections. The project assignments comprised oral presentations on psychological disorders (Schwartz et al. 2017), a role-play ethics activity (Rose et al. 2022), and poster presentations about hidden figures in psychology (Hurley 2021). These project assignments will not be discussed further in this report.

Table 1. Descriptives for student characteristics (N = 1061).
Student Characteristic | Descriptive Statistics
Age | Mean (SD) = 19.6 years (3.7), Range = 16–45
Gender | Frequency (%)
  Female | 627 (59.1%)
  Male | 413 (38.9%)
  Another Gender Identity/Prefer to Self-describe | 9 (0.8%)
  Prefer not to Respond | 12 (1.1%)
Race/ethnicity (not mutually exclusive) | Frequency (%)
  White | 408 (38.5%)
  Latinx, Chicanx, Hispanic, or Spanish origin | 310 (29.2%)
  Black/African American | 189 (17.8%)
  Asian/Asian American | 119 (11.2%)
  Middle Eastern/North African | 95 (9.0%)
  American Indian/Alaska Native | 13 (1.2%)
  Native Hawaiian/Other Pacific Islander | 5 (0.5%)
  Some other race | 23 (2.2%)
  Prefer not to say/Unknown | 50 (4.7%)
Either parent attended college (Yes = 1) | 553 (52.1%)

Online Asynchronous Assignments

Noba textbook quizzes

Students were assigned 25 low-stakes quizzes, each comprising four multiple-choice questions taken from one of the learning modules included in the Noba Introductory Psychology collection (Diener Education Foundation n.d.). Students were allowed to use the Noba textbook (open-book) and could take each quiz up to three times, with the highest score out of the three attempts retained. In creating the test bank, we selected twelve questions per learning module, with four quiz questions randomly selected from the test bank on each attempt. We calculated a single quiz score for each student, indicating the percentage correct out of 100 questions (M = 85.8%, SD = 20.7%, Range = 0–100). Note here that just ten students (0.9% of the analytic sample) failed to complete any of the textbook quizzes and received scores of 0.
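As an illustration of this scoring rule, the sketch below computes the composite from a hypothetical long-format gradebook in R; the object and column names are assumptions for illustration, not the structure of the de-identified datafile on OSF.

# Sketch of the Noba quiz composite (hypothetical column names).
# `attempts` has one row per student x quiz x attempt, with `n_correct` out of 4 items.
library(dplyr)

noba_scores <- attempts %>%
  group_by(student_id, quiz_id) %>%
  summarise(best = max(n_correct), .groups = "drop") %>%   # keep best of up to 3 attempts
  group_by(student_id) %>%
  summarise(noba_pct = 100 * sum(best) / 100)              # 25 quizzes x 4 items = 100 questions
# Students with no attempts at all would need to be added back with a score of 0.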

Qualtrics homework assignments

Online homework assignments were created using Qualtrics survey software by a team of course instructors, with links to each assignment posted to the learning management system. The homework assignments introduced students to scientific abstracts, database searches for psychology-relevant empirical literature through the college library website, how to interpret research questions and results from research in psychology, TED talks by prominent psychologists in the field (e.g. Carol Dweck, Dan Gilbert), and quantitative reasoning problems developed by the APA Presidential Task Force on Introductory Psychology Statistical Literacy (Neufeld et al. 2022). On average, students completed 83.3% of the nine Qualtrics assignments (M = 7.5, SD = 1.9, Range = 2–9).

Excel worksheet activities and content acquisition podcasts

Six of the nine HWs included screen-recorded demonstrations, also known as content acquisition podcasts (CAPs) (Kennedy et al. 2016), on how to use Excel to visualize and analyze psychology-relevant data. The CAPs were developed by the instructional team using Zoom for screen recording. Each Excel CAP provided step-by-step instruction on how to compute relevant summary statistics, such as means and standard deviations, create data visualizations, such as scatter plots and bar charts, and construct contingency tables to compare binary count data. The CAPs included subtitles that aligned with the verbal instructions to meet Universal Design standards (CAST, n.d.). For each of the six Excel data analysis and visualization activities, worksheets were prepared using datasets assembled by the instructional team or a publicly available dataset (Emerging Adulthood Measured at Multiple Institutions 2 [EAMMI-2]; Grahe et al. 2018). The datasets featured student responses to psychology-related questionnaires (e.g. Big 5 personality traits, compulsive internet use, mate preferences, sleep habits).

Students were required to upload their completed worksheets to the learning management system. All CAPs were also made available via the learning management system so that students could complete the Excel worksheet activities at a separate time, i.e., before or after completing the associated Qualtrics homework assignment. Table 2 provides a description for each of the six Excel worksheet activities, along with a screenshot of a completed assignment and links to the CAPs. Ideally, students’ final uploaded spreadsheets would match the examples provided in Table 2, with partial credit awarded for incomplete work. On average, students completed 61.7% of the six Excel worksheets (M = 3.7, SD = 2.3, Range = 0–6). In total, 188 students (17.8% of the analytic sample) failed to complete any Excel work.

Table 2. Descriptions of Excel worksheet activities and content acquisition podcasts.
1. Calculating Pearson correlations between scale scores for Big 5 personality traits and compulsive internet use; visualizing correlational data using a scatterplot with linear trend line.
CAPs — Part 1: Visualize correlational data. Part 2: Calculate correlations between variables.
Screenshot: Scatterplot demonstrating a positive, moderate, linear association between Neuroticism (y-axis) and Compulsive Internet Use (x-axis).

2. Calculating category counts for mate preference traits (Buss 1990); visualizing the frequencies of preferred traits reported by men and women in a bar graph.
CAPs — Part 1: Calculate sum counts. Part 2: Calculate percentages using fill down. Part 3: Visualize data in a bar chart.
Screenshot: Bar graph showing the frequency counts of mate preference traits for males versus females across the listed trait categories.

3. Calculating odds ratios to investigate whether there is a higher likelihood of crime occurring in community lots with no “greening” efforts (Heinze et al. 2018).
CAPs — Part 1: Recoding binary variables. Part 2: Tabulating contingencies using pivot tables. Part 3: Calculating odds ratios using formulas.
Screenshot: Excel spreadsheet showing a 2-by-2 contingency table of crime counts in maintained (greened) versus non-maintained (non-greened) lots. An odds ratio calculated from the contingency table provides evidence of increased odds of crime in non-greened lots compared to greened lots.

4. Reverse scoring data from a Likert scale on growth mindset (Dweck 2006) and calculating averages for each respondent in the dataset.
CAP — Reverse score a scale and calculate averages.
Screenshot: Excel spreadsheet containing Likert scale data from three growth mindset items (columns) for eight respondents (rows). A fourth column demonstrates how to reverse code one of the original growth mindset items, and a fifth column demonstrates how to average across the scale using the reverse-coded item. A grand average across the eight respondents is provided.

5. Using pivot tables to organize data from students’ self-reported sleep habits (measure adapted from the Pittsburgh Sleep Quality Index, Buysse et al. 1989); visualizing a normal distribution of average hours of sleep.
CAPs — Part 1: Calculating frequencies using pivot tables. Part 2: Visualizing properties of the normal distribution.
Screenshot: Normal distribution (bell-shaped curve) representing data on average hours of sleep in college students. The center of the distribution along the x-axis marks the average of the sleep-hours data, with one, two, and three standard deviations from the mean also marked along the x-axis.

6. Using independent samples t-tests to compare groups on their attitudes towards risk avoidance behaviors as a marker of emerging adulthood (Grahe et al. 2018).
CAPs — Part 1: Comparing male and female respondents using a t-test. Part 2: Comparing conservative and liberal respondents using a t-test.
Screenshot: Independent samples t-test output demonstrating a statistically significant difference between liberal and conservative respondents in their attitudes towards risk avoidance behaviors.
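To make the spreadsheet steps concrete, the following sketch reproduces the core calculations from activities 3 (odds ratio) and 4 (reverse scoring) in R rather than Excel; the counts, item values, and object names are illustrative assumptions, not the course datasets.

# Activity 3 analogue: odds ratio from a 2x2 contingency table (illustrative counts).
crime_tab <- matrix(c(40, 160,    # greened lots:     crime, no crime
                      90, 110),   # non-greened lots: crime, no crime
                    nrow = 2, byrow = TRUE,
                    dimnames = list(lot = c("greened", "non_greened"),
                                    crime = c("yes", "no")))
odds_greened     <- crime_tab["greened", "yes"]     / crime_tab["greened", "no"]
odds_non_greened <- crime_tab["non_greened", "yes"] / crime_tab["non_greened", "no"]
odds_ratio <- odds_non_greened / odds_greened   # > 1 means higher odds of crime in non-greened lots

# Activity 4 analogue: reverse-score one 1-5 Likert item, then average across items.
mindset <- data.frame(item1 = c(4, 2, 5),
                      item2 = c(3, 4, 2),
                      item3 = c(2, 5, 1))        # item3 is reverse-worded
mindset$item3_rev  <- 6 - mindset$item3          # reverse code: 1<->5, 2<->4, 3 stays 3
mindset$scale_mean <- rowMeans(mindset[, c("item1", "item2", "item3_rev")])
mean(mindset$scale_mean)                         # grand average across respondents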

Embedded vocabulary quizzes

Five multiple-choice questions were added to each Qualtrics assignment as a formative assessment of content knowledge. There were forty-five questions in total—twenty-eight questions assessing knowledge of general psychology terminology and concepts and seventeen questions assessing knowledge of statistical terminology and concepts. Examples of general psychology items are: “What stage of life is characterized by a feeling of in-betweenness, where people do not feel like children but don’t necessarily feel like adults either?” and “Istvan is well liked by his friends because he is trusting, considerate, compassionate, loyal, and easy to get along with. Which of the Big 5 factors of personality would Istvan likely score high in?” Examples of statistics items are: “A statistical measure that indicates the extent to which two or more variables are related is called __________” and “What does it mean when the p-value of a t-test comparing two groups is LESS than .05?”

Students attempted an average of 37.3 (SD = 9.7, Range = 10–45) vocabulary questions (general psychology items: M = 23.8, SD = 5.5, Range = 7–28; statistics items: M = 13.5, SD = 4.5, Range = 1–17). Performance (accuracy) was scored by taking the proportion of correct answers out of the total number of questions attempted. We excluded skipped quizzes so that our measure of comprehension accuracy would be independent of the number of Qualtrics homework assignments completed. Accuracy was similar across item types (general psychology items: M = 85.7%, SD = 16.8%, Range = 21.5%–100%; statistics items: M = 86.8%, SD = 19.3%, Range = 0%–100%).1 Accuracy across general psychology and the statistics vocabulary items was strongly positively correlated, r(1059) = .77, p < .001. Therefore, we combined both item types to create composite scores for the embedded vocabulary quizzes (M = 86.0%, SD = 16.7%, Range = 22.1%–100%).
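A minimal sketch of this accuracy measure, again with hypothetical object and column names: items from unsubmitted assignments are treated as missing and dropped, so the denominator is the number of questions attempted rather than all forty-five items.

# Hypothetical long-format data: one row per student x vocabulary question,
# with `response` equal to NA when the question's homework was never submitted.
library(dplyr)

vocab_accuracy <- vocab %>%
  filter(!is.na(response)) %>%                    # exclude skipped quizzes/items
  group_by(student_id) %>%
  summarise(n_attempted = n(),
            accuracy    = mean(response == key))  # proportion correct of items attempted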

Additional Measures

Demographic questionnaire

HW1 contained a demographic questionnaire and other measures used as predictor variables in the analyses. These included measures of statistical knowledge, statistics anxiety, self-efficacy for online learning, as well as questions about Microsoft 365 software installation and the device used to complete the assignment.

Statistical knowledge test

To create a formative measure of statistics knowledge, the instructional team adapted twelve items from publicly available AP Statistics practice tests and the New York Times “What's Going On in This Graph?” column, developed by the New York Times in partnership with the American Statistical Association (The Learning Network 2021). The multiple-choice problems used data visualizations (i.e., graphs, tables) and/or word prompts to query knowledge of target statistical concepts. Following each problem, students selected an answer from four response options. This test exhibited low internal consistency (Cronbach’s 𝛼 = .57); however, given the wide range of statistical concepts covered, this was not unexpected (i.e., a knowledge test covering diverse topics is not expected to perform like a scale). Table 3 provides item-level statistics organized by difficulty; note that the target concepts largely map onto the content of the Excel worksheet activities. Students exhibited a wide range of statistical knowledge on the test, with an overall average of 50.7% correct (SD = 19.4%, Range = 0.0%–100.0%).
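For readers who wish to reproduce this kind of reliability summary, one common approach is the alpha() function from the psych R package; the data frame and item column names below are placeholders, and this is a sketch rather than the authors’ actual script.

# Internal consistency of the 12-item statistical knowledge test (items scored 0/1).
library(psych)

items <- hw1[, paste0("stat_item_", 1:12)]   # placeholder column names, 0 = incorrect, 1 = correct
rel <- psych::alpha(items)
rel$total$raw_alpha        # overall Cronbach's alpha (about .57 in the present sample)
rel$alpha.drop             # alpha if item deleted, as reported in Table 3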

Table 3. Item level statistics for the Statistical Knowledge Test in order of difficulty (N = 1061)
Target Concept | Percent Correct | 𝛼 if item deleted
Calculating a median | 77.3% | .55
Means and standard deviations | 77.3% | .53
Calculating a mean | 76.7% | .52
Scatterplots and different associations between variables | 56.8% | .53
Scatterplots and correlation coefficients | 50.5% | .54
Histograms | 49.7% | .54
Uses of bar graphs | 49.2% | .55
Normal distribution | 45.3% | .56
Comparing different distributions | 43.6% | .54
Independent sample t-tests and alpha levels | 39.4% | .58
Types of t-tests | 22.6% | .56
Features of a normal distribution | 20.5% | .57

Statistics anxiety scale

The six-item statistics anxiety scale, adapted from the Statistics Anxiety Rating Scale (STARS; Teman 2013), asked students to rate their anxiety towards a statistics-related activity (three items) or their agreement with a statement about doing statistics (three items); see Table 4 for the list of items. Each set of items used a 5-point Likert scale (for anxiety ratings: 1 = Not at all Anxious, 5 = Extremely Anxious; for agreement statements: 1 = Strongly Disagree, 5 = Strongly Agree). Students on average were slightly anxious about statistics (M = 2.27, SD = 0.96, Range = 1–5) and moderately agreed with statements expressing difficulty with statistics (M = 2.81, SD = 1.09, Range = 1–5). We created a composite measure of the two subscales (M = 2.54, SD = 0.81, 𝛼 = .78).

Table 4. Item level statistics for the statistics anxiety scale (STARS).2
Item | Mean | SD | 𝛼 if item deleted
1. Reading a journal article that includes some statistical analyses | 2.14 | 1.04 | .73
2. Trying to understand the statistical analyses described in the abstract of a journal article | 2.41 | 1.06 | .73
3. Interpreting the meaning of a table in a journal article | 2.26 | 1.05 | .73
4. I have not had math for a long time. I know I will have problems getting through statistics | 3.06 | 1.36 | .76
5. I cannot even understand seventh- and eighth-grade math; how can I possibly do statistics? | 2.37 | 1.26 | .76
6. Since I have never enjoyed math I do not see how I can enjoy statistics | 3.01 | 1.33 | .77

Self-efficacy for online learning

We administered an adapted version of Shen et al.’s (2013) self-efficacy scale for online learning. The adapted scale encompassed minor changes in wording to ensure that the items were appropriate for both in-person and online course sections. The scale consisted of fifteen items assessing students’ confidence in their ability to complete coursework (e.g. Complete the course with a good grade), use online technology (e.g. Download course materials from Blackboard), and interact with their instructor and peers (e.g. Seek help from my classmates when needed). Students responded using a 5-point Likert scale (1 = Not confident at all, 5 = Completely confident). Composite scores were created by averaging across the fifteen items (M = 3.88, SD = 0.68, 𝛼 = .93); see Table 5 for item-level statistics.

Table 5. Item level statistics for the self-efficacy for online learning scale.3
Item | Mean | SD | 𝛼 if item deleted
How confident are you that you could do the following tasks?
1. Complete the course with a good grade | 3.83 | 0.89 | .92
2. Understand complex concepts | 3.44 | 0.97 | .92
3. Overcome challenges that could negatively affect my learning | 3.56 | 0.94 | .92
4. Successfully complete all of the required activities | 3.94 | 0.90 | .92
5. Keep up with the course schedule | 3.83 | 0.94 | .92
6. Create a plan to complete a given assignment | 3.80 | 0.93 | .92
7. Willingly adapt my study habits to meet course expectations | 3.72 | 0.93 | .92
How confident are you that you could use online tools to complete the following tasks?
1. Download course materials from Blackboard | 4.13 | 0.96 | .92
2. Install software on my computer | 3.95 | 1.06 | .92
3. Submit assignments through Blackboard | 4.22 | 0.91 | .92
4. Send emails to my instructor | 4.29 | 0.87 | .92
How confident are you that you could interact with your instructor and classmates in the following ways?
1. Ask my instructor questions | 4.02 | 0.95 | .92
2. Inform my instructor when unexpected situations arise | 4.10 | 0.91 | .92
3. Express my opinions in class respectfully | 3.85 | 1.08 | .92
4. Seek help from my classmates when needed | 3.45 | 1.20 | .93

Device used

Students were asked to report the device they used to submit their coursework from the following options: desktop computer, laptop, smartphone, or tablet. Most students (n = 871, 82.1%) reported that they used a desktop computer or laptop while the remainder (n = 190, 17.9%) indicated that they used a smartphone or tablet for Qualtrics HW1.

Software installation and access

Students were asked to self-report their success in installing Microsoft Office 365. The majority (n = 939, 88.5%) reported that they were able to install Microsoft Office 365 on their device or already had access to Microsoft Office while the remaining students (n = 122, 11.5%) reported that they were unable to install the software at the start of the semester.

Results

Preliminary Analysis

We used linear mixed-effects models to examine predictors of Noba textbook quiz scores, embedded vocabulary quiz scores, Qualtrics homework completion, and Excel worksheet completion. All analyses were done in R version 4.4.0 (R Core Team 2024) using the lme4 package version 1.1-35.3 (Bates et al. 2015). To account for students nested within sections, course section was included in all models as a random effect. Preliminary analyses indicated that course section characteristics (in-person vs. online formats, regular vs. large enrollment) were unrelated to students’ inclusion in the analytic sample. That is, the probability of a student turning in Qualtrics HW1 and at least one other assignment did not vary by course section characteristics (in-person vs. online format: 𝛽 = –0.05, p = .89; regular vs. large course enrollment: 𝛽 = –0.15, p = .71).

Additionally, preliminary analyses indicated that course section characteristics (in-person vs. online formats, regular vs. large enrollment) were unrelated to the dependent variables (Noba textbook quiz scores, embedded vocabulary quiz scores, Qualtrics homework completion, and Excel worksheet completion). Course section characteristics were dropped from the final models. Further preliminary analyses indicated that using a computer versus an iPhone/tablet and students’ self-reported success in installing the Microsoft 365 suite at the start of the semester were unrelated to the dependent variables. These variables were also dropped from the final models. The final models are reported below, organized by dependent variable.
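To show the general form of these models, the sketch below specifies one of the final models in lme4 with course section as a random intercept; the variable names are placeholders, and lmerTest and MuMIn are shown as one possible way (not necessarily the one used here) to obtain p-values and variance explained.

# Linear mixed-effects model with course section as a random intercept
# (placeholder variable names; see the OSF repository for the actual analysis script).
library(lme4)
library(lmerTest)   # adds p-values to lmer summaries (one possible approach)
library(MuMIn)      # r.squaredGLMM() for variance explained (one possible approach)

m_noba <- lmer(noba_quiz_pct ~ stat_knowledge + stat_anxiety + self_efficacy +
                 vocab_quiz_acc + (1 | course_section),
               data = dat)
summary(m_noba)           # fixed-effect estimates, SEs, t- and p-values (cf. Tables 6-9)
r.squaredGLMM(m_noba)     # marginal/conditional R-squared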

Noba Textbook Quiz Scores

We used the following variables to predict Noba textbook quiz scores (M accuracy = 85.8%, SD = 20.7%): statistical knowledge test scores, statistics anxiety scale scores, self-efficacy for online learning, and embedded vocabulary quiz scores. Table 6 presents the final model, which explained 9.7% of the variance in Noba quiz scores. Students with higher scores on the statistical knowledge test at the beginning of the semester did better on the quizzes (𝛽 = 14.32, p < .001), as did students with higher self-efficacy for online learning (𝛽 = 2.12, p = .02). Higher accuracy on the vocabulary quizzes embedded in the Qualtrics homework assignments was associated with higher scores on the Noba textbook quizzes (𝛽 = 18.77, p < .001).

Table 6. Regression model predicting Noba textbook quiz scores (N = 1061).4
Variable | 𝛽 | SE | t | p
Intercept | 53.913 | 6.038 | 8.929 | < .001
Statistical Knowledge Test | 14.320 | 3.323 | 4.309 | < .001
Statistics Anxiety Scale | 0.069 | 0.814 | 0.084 | .933
Self-efficacy for Online Learning | 2.122 | 0.940 | 2.258 | .024
Embedded Vocabulary Quiz Score | 18.772 | 3.705 | 5.066 | < .001

Embedded Vocabulary Quizzes

We used the following variables to predict accuracy on the embedded vocabulary quizzes (M = 86.0%, SD = 16.7%): statistical knowledge test scores, statistics anxiety scale scores, self-efficacy for online learning, and Noba textbook quiz scores. Table 7 presents the final model, which explained 5.4% of the variance in embedded quiz scores. Students with higher scores on the statistical knowledge test at the beginning of the semester exhibited higher vocabulary comprehension (𝛽 = 0.11, p < .001). Mirroring the previous model, higher Noba textbook quiz scores were associated with higher scores on the embedded vocabulary quizzes (𝛽 = .001, p < .001).

Table 7. Regression model predicting embedded vocabulary quiz scores (N = 1061).5
Variable | 𝛽 | SE | t | p
Intercept | 0.771 | 0.045 | 17.108 | < .001
Statistical Knowledge Test | 0.106 | 0.027 | 3.878 | < .001
Statistics Anxiety Scale | –0.011 | 0.007 | –1.610 | .108
Self-efficacy for Online Learning | –0.011 | 0.008 | –1.367 | .172
Noba Textbook Quiz Score | 0.001 | 0.000 | 4.867 | < .001

Qualtrics Homework Assignments

On average, students completed 7.5 out of 9 Qualtrics homework assignments (SD = 1.9, Range = 2–9). Table 8 presents the final model, which explained 11.8% of the variance in homework completion. Students with higher scores on the statistical knowledge test at the start of the semester (𝛽 = 0.13, p < .001) and those with higher scores on the embedded vocabulary quizzes (𝛽 = 0.18, p < .001) completed more of the Qualtrics assignments.

Table 8. Regression model predicting Qualtrics Homework submission (N = 1061).6
Variable | 𝛽 | SE | t | p
Intercept | 0.547 | 0.063 | 8.621 | < .001
Statistical Knowledge Test | 0.127 | 0.035 | 3.677 | < .001
Statistics Anxiety Scale | 0.005 | 0.008 | 0.607 | .544
Self-efficacy for Online Learning | 0.013 | 0.010 | 1.326 | .185
Embedded Vocabulary Quiz Score | 0.179 | 0.039 | 4.637 | < .001

Excel Worksheets

On average, students completed 3.7 out of 6 Excel worksheets (SD = 2.3, Range = 0–6). Table 9 presents the final model explaining 10.3% of the variance in Excel worksheet completion. Students with higher scores on the statistical knowledge test completed more Excel worksheets (𝛽 = 1.87, p < .001), as did students with higher self-efficacy for online learning (𝛽 = 0.37, p < .001). Students with higher accuracy on the embedded vocabulary quizzes (𝛽 = 1.70, p < .001) also completed more Excel work.

Table 9. Regression model predicting Excel submission (N = 1061).7
Variable | 𝛽 | SE | t | p
Intercept | –0.101 | 0.670 | –0.151 | .880
Statistical Knowledge Test | 1.867 | 0.369 | 5.057 | < .001
Statistics Anxiety Scale | –0.011 | 0.090 | –0.118 | .906
Self-efficacy for Online Learning | 0.372 | 0.104 | 3.562 | < .001
Embedded Vocabulary Quiz Score | 1.704 | 0.412 | 4.139 | < .001

Discussion

The present study reports on a departmental effort to teach quantitative reasoning in Introductory Psychology (PSY100), in alignment with the APA Guidelines for the Undergraduate Major (APA 2023) and the APA Presidential Task Force on Introductory Psychology Statistical Literacy (Neufeld et al. 2022). To promote psychology as a data-driven science, PSY100 students were assigned a series of online homeworks via Qualtrics survey software and accompanying Excel worksheet activities using CAPs to guide students in analyzing and visualizing psychology-relevant datasets. Our use of CAPs served to scaffold active learning by providing step-by-step screencast tutorials for each assignment. We used embedded quizzes as formative assessments of both statistics and general psychology vocabulary comprehension to gauge students’ grasp of lesson content throughout the semester. We also used low-stakes quizzes to assess student comprehension of each of the Noba textbook modules and track their continued engagement with course material. Student performance on both the embedded vocabulary quizzes (M = 86.0%, SD = 16.7%, Range = 22.1%–100%) and the Noba textbook quizzes (M = 85.8%, SD = 20.7%, Range = 0–100) indicated that the curriculum was developmentally appropriate. Moreover, students with higher scores on the Noba textbook quizzes also tended to have higher scores on the embedded vocabulary quizzes, which in turn were associated with their completing more of the Qualtrics homework assignments and Excel assignments over the course of the semester. These results are in keeping with the value of formative assessments not only for gauging and monitoring students’ comprehension of lessons, but also for providing critical feedback to students that may encourage academic persistence (Clark 2012; Grose-Fifer et al. 2019). Students’ high scores on the Noba quizzes underscore that it is possible to engage students in reading their textbooks, which runs counter to the common perception that college students no longer read (McMurtrie 2024).

Both the Noba textbook quiz scores and the embedded vocabulary quiz scores were associated with prior statistical knowledge. That is, higher scores on the statistical knowledge test at the start of the semester were associated with better comprehension of the textbook modules and of statistical and general psychology terminology from the lessons. Higher scores on the statistical knowledge test were also associated with completing more Qualtrics homework assignments and Excel worksheets. Students entered the course with wide variability in their statistical knowledge, with performance on our statistical knowledge test ranging from 0–100%. Although the statistical knowledge test utilized problems taken from The New York Times and other sources designed for high school students, our college student sample averaged only 50.7% correct. This appears to be in keeping with Cook and Fukawa-Connelly (2016), who reported idiosyncratic patterns of statistical knowledge among first-year college students, with some correct understandings existing alongside various misconceptions and/or a general lack of knowledge. Our findings reflect broader concerns about declining mathematical knowledge of U.S. students at college entry (Carpenter and Kirk 2017; Er 2018), which may have worsened further since the COVID-19 pandemic (Toness and Lurye 2022). Students with insufficient baseline knowledge in mathematics may be less prepared to complete assignments that involve quantitative reasoning and college-level coursework in general.

In contrast to statistics knowledge, statistics anxiety was unrelated to learning outcomes. In other words, despite wide variation in statistics anxiety scores, statistics anxiety did not impede students’ success and engagement with the online homeworks over the course of the semester. Although this result appears to be inconsistent with prior research linking higher math anxiety with either lower (Pekrun 2006; Pekrun et al. 2017) or higher academic achievement (Brooks et al. 2024), it matches other null findings (Trassi et al. 2022). In the present study, the lack of a relation with statistics anxiety could be attributed to the nature of the homework assignments, which were intentionally low-stakes and graded solely on the basis of completion. Assigning low-stakes homework, where students have perceived control over their learning, may mitigate effects of high statistics anxiety on course outcomes (Pekrun et al. 2017). Chang and Beilock (2016) emphasize the need for interventions to reduce the linkage between math anxiety and academic achievement. Using low-stakes assignments to increase self-efficacy and perceived control of learning could be one example of an effective intervention. That said, the varied findings in the literature suggest that the relation between statistics anxiety and academic achievement may be complex and mediated by other factors (Finney and Schraw 2003; Trassi et al. 2022). It would be fruitful for future research to examine whether statistics anxiety has a stronger association with learning outcomes if assignments are graded based on correctness rather than completion.

As part of our department-wide effort to revise the PSY100 curriculum, we packaged course materials as self-paced, asynchronous online assignments that would be suitable for students enrolled in either in-person or fully online course sections. In contrast with prior findings indicating worse outcomes for students who rely on mobile devices for completing online assignments (Brooks et al. 2024; Gravelle et al. 2024), our preliminary analyses found no significant differences in learning outcomes for students who completed HW1 using a hand-held device as compared to a laptop/desktop computer. Though 18% of students reported completing the assignment on a mobile device, repeated emphasis on the importance of using a laptop/desktop may have encouraged students to use computers for future assignments. Unfortunately, for the present study, we did not track students’ uses of specific devices beyond HW1.

Notably, our preliminary analyses also found no effects of course modality or enrollment size (regular vs. large) on outcome measures. These null results indicate that the curriculum was equally accessible across modalities and scalable to the large-format “lecture-based” PSY100 courses taught at many institutions (Long and Coldren 2006). Whereas course characteristics were unrelated to learning outcomes, students’ self-efficacy for online learning was associated with their obtaining higher scores on the Noba textbook quizzes and with submitting more of the Excel worksheets. Although not all studies have found a link between self-efficacy and course outcomes (Gravelle et al. 2024), the current results align with several studies that have found this positive association (Alqurashi 2016; Gurung and Stone 2023; Tsai et al. 2011). Self-efficacy may be just one piece of the puzzle in getting students to engage with online asynchronous assignments and in establishing learning presence, alongside a collection of other student beliefs and behaviors found to affect students’ engagement with online coursework (Shea and Bidjerano 2010).

In addition to self-efficacy, Shea and Bidjerano (2010) emphasize that attendant effort is an integral piece of the learning presence needed to thrive in an online learning environment. Despite the positive learning outcomes for students who engaged with the asynchronous online course materials, a large number of non-responding students failed to submit the online asynchronous assignments. That is, of the 1523 students enrolled in PSY100, 28.9% did not complete even the first assignment, indicating a decision not to attend to or engage with the curriculum from the outset of the semester. Most of these students received a grade of D, F, or withdrawal (DFW), indicating that failure to engage with coursework early on is prognostic of a negative course outcome. The observed non-response rate aligns with reports of chronic absenteeism in college courses, which increased dramatically in the aftermath of the COVID-19 pandemic, especially at non-selective minority-serving institutions (Bulman and Fairlie 2022; National Student Clearinghouse Research Center 2022, 2024). Though beyond the scope of the present study, we acknowledge the myriad social and economic factors contributing to college students’ relationships with higher education institutions and affecting their academic trajectories. These may include limited access to computers, high-speed internet, and other technology (Jaggars et al. 2021), family and work obligations (Perna 2010; Porchea et al. 2010), food and housing insecurity (Ilieva et al. 2018), and systemic racism and social injustice (Williams et al. 2022). These factors create a cumulative burden on students’ self-efficacy for academic work and their sense of belonging in college (DeFreitas and Rinn 2013), and should be examined in future research. For example, our self-efficacy scale was limited in its focus on online learning, and did not allow us to consider how students’ identities and lived experiences may have influenced their perceptions as college students. Future studies should include self-report measures of students’ college experiences, including qualitative measures of their sense of community and connection to instructors and peers (Garrison and Arbaugh 2007), and personal and school-related challenges (Gravelle et al. 2024).

Finding ways to engage first-year college students effectively, in the hope that they will find college coursework to be relevant and worth pursuing, remains a challenge (Belkin 2021; Tough 2023). Many students today approach college with ambivalence. Understanding why students fail to engage with college coursework may require a shift away from individual-level factors like self-efficacy toward social and societal factors affecting academic persistence (Roberts et al. 2025; Zengilowski et al. 2023). Our findings indicate that PSY100 students often “quit” the course without completing a single assignment. This suggests that strong connections need to be fostered at the start of the semester to have any impact on student retention. Efforts to promote social engagement with peers may help students establish a sense of belonging in higher education and aid in retention (Ahn and Davis 2023). Along these lines, our institution is exploring whether employing successful former students as peer leaders in PSY100 might improve learning outcomes (McCloskey et al. 2024). Promoting peer-to-peer interactions may help to build a sense of community and belonging in college courses, with benefits for both peer leaders and the students enrolled in the course (Stigmar 2016). That notwithstanding, co-curricular efforts to increase social supports for first-year college students may have limited effects on retention if the wider societal and systemic factors affecting students remain unaddressed (Zengilowski et al. 2023).

In conclusion, the present study demonstrates the feasibility of a department-wide effort to embed quantitative reasoning in a general education PSY100 course. Our curriculum provides a framework for instructors to rethink how they teach general education courses so as to incorporate the broad-based scientific and analytic skills valued in the workplace (NACE 2021). Key components of the curriculum included (1) self-paced quizzes to encourage close reading of the textbook, (2) homework assignments featuring library resources (e.g. Google Scholar), scientific abstracts, TED talks from prominent researchers, and embedded quizzes as formative assessments, and (3) hands-on Excel worksheet activities to introduce data analysis and visualization techniques. Further research is needed to address issues related to student apathy, absenteeism, and broader problems associated with declining enrollments in higher education. As a starting point, instructors should take steps to engage college students from the start of the semester in developing the analytic and scientific reasoning skills they need for college and for their future careers.

References

Adams, William C., Donna Lind Infeld, and Carli M. Wulff. 2013. “Statistical Software for Curriculum and Careers.” Journal of Public Affairs Education 19, no. 1 (Winter): 173–88. http://www.jstor.org/stable/23608939.

Ahn, Mi Young and Howard H. Davis. 2023. “Students’ Sense of Belonging and Their Socio-Economic Status in Higher Education: A Quantitative Approach.” Teaching in Higher Education 28, no.1: 136–49. https://doi.org/10.1080/13562517.2020.1778664.

Alfieri, Louis, Patricia J. Brooks, Naomi J. Aldrich, and Harriet R. Tenenbaum. 2011. “Does Discovery-Based Instruction Enhance Learning?” Journal of Educational Psychology 103, no. 1 (February): 1–18. https://doi.org/10.1037/a0021017.

Alqurashi, Emtinan. 2016. “Self-Efficacy in Online Learning Environments: A Literature Review.” Contemporary Issues in Education Research 9, no. 1: 45–52. https://doi.org/10.19030/cier.v9i1.9549.

American Psychological Association. 2023. “APA Guidelines for the Undergraduate Psychology Major Version 3.0.” American Psychological Association. https://www.apa.org/about/policy/undergraduate-psychology-major.

Bates, Douglas, Martin Mächler, Benjamin M. Bolker, and Steven C. Walker. 2015. “Fitting Linear Mixed-Effects Models Using lme4.” Journal of Statistical Software 67, no. 1: 1–48. https://doi.org/10.18637/jss.v067.i01.

Belkin, Douglas. 2021. “A Generation of American Men Give up on College: ‘I Just Feel Lost’.” The Wall Street Journal, September 6, 2021. https://www.wsj.com/articles/college-university-fall-higher-education-men-women-enrollment-admissions-back-to-school-11630948233.

Breneiser, Jennifer E., Joshua S. Rodefer, and Jeremy R. Tost. 2018. “Using Tutorial Videos to Enhance the Learning of Statistics in an Online Undergraduate Psychology Course.” North American Journal of Psychology 20, no. 3: 751. Gale Academic OneFile. https://link.gale.com/apps/doc/A563457868/AONE?sid=bookmark-AONE&xid=683b29dd.

Brooks, Patricia J., C. Donnan Gravelle, Nicole M. Zapparrata, Elizabeth S. Che, Arshia K. Lodhi, Raoul Roberts, and Jessica E. Brodsky. 2024. “Redesigning the Introductory Psychology Course to Support Statistical Literacy at an Open-Admissions College.” Scholarship of Teaching and Learning in Psychology. Advance online publication. https://psycnet.apa.org/doi/10.1037/stl0000407.

Bulman, George and Robert Fairlie. 2022. “The Impact of COVID-19 on Community College Enrollment and Student Success: Evidence from California Administrative Data.” Education Finance and Policy 17, no. 4: 745–64. https://doi.org/10.1162/edfp_a_00384.

Buysse, Daniel J., Charles F. Reynolds III, Timothy H. Monk, Susan R. Berman, and David J. Kupfer. 1989. “The Pittsburgh Sleep Quality Index: A New Instrument for Psychiatric Practice and Research.” Psychiatry Research 28, no. 2 (May): 193–213. https://doi.org/10.1016/0165-1781(89)90047-4.

Buss, David M., Max Abbott, Alois Angleitner, Armen Asherian, Angela Biaggio, Angel Blanco-Villasenor, M. Bruchon-Schweitzer et al. 1990. “International Preferences in Selecting Mates: A Study of 37 Cultures.” Journal of Cross-Cultural Psychology 21, no. 1: 5–47. https://doi.org/10.1177/0022022190211001.

Carpenter, Thomas P. and Roger E. Kirk. 2017. “Are Psychology Students Getting Worse at Math?: Trends in the Math Skills of Psychology Statistics Students Across 21 Years.” Educational Studies 43, no. 3: 282–95. https://doi.org/10.1080/03055698.2016.1277132.

CAST. n.d. “The UDL Guidelines.” Accessed November 16, 2023. https://udlguidelines.cast.org/.

Chang, Hyesang and Sian L. Beilock. 2016. “The Math Anxiety-Math Performance Link and Its Relation to Individual and Environmental Factors: A Review of Current Behavioral and Psychophysiological Research.” Current Opinion in Behavioral Sciences 10 (August): 33–38. https://doi.org/10.1016/j.cobeha.2016.04.011.

Clark, Ian. 2012. “Formative Assessment: Assessment is for Self-Regulated Learning.” Educational Psychology Review 24: 205–49. https://doi.org/10.1007/s10648-011-9191-6.

Collins, Kate, Gerard Dooley, and Orna O’Brien. 2022. “A Reflection on Evolving Student Support in a Post-Pandemic Higher Education Environment.” Journal of Applied Learning and Teaching 5, no. 2: 42–50. https://doi.org/10.37074/jalt.2022.5.2.6.

Colvard, Nicholas B., C. Edward Watson, and Hyojin Park. 2018. “The Impact of Open Educational Resources on Various Student Success Metrics.” International Journal of Teaching and Learning in Higher Education 30, no. 2: 262–76. http://www.isetl.org/ijtlhe/pdf/IJTLHE3386.pdf.

Cook, Samuel A. and Timothy Fukawa-Connelly. 2016. “The Incoming Statistical Knowledge of Undergraduate Majors in a Department of Mathematics and Statistics.” International Journal of Mathematical Education in Science and Technology 47, no. 2: 167–84. https://doi.org/10.1080/0020739X.2015.1060642.

DeFreitas, Stacie Craft and Anne Rinn. 2013. “Academic Achievement in First Generation College Students: The Role of Academic Self-Concept.” Journal of the Scholarship of Teaching and Learning: 57–67. https://scholarworks.iu.edu/journals/index.php/josotl/article/view/2161.

Diener Education Foundation. n.d. “Introduction to Psychology: The Full Noba Collection.” Noba Project. https://nobaproject.com/textbooks/introduction-to-psychology-the-full-noba-collection.

Dweck, Carol S. 2006. Mindset: The New Psychology of Success. Random House.

El Mansour, Bassou and Davison M. Mupinga. 2007. “Students’ Positive and Negative Experiences in Hybrid and Online Classes.” College Student Journal 41, no. 1: 242–48. https://eric.ed.gov/?id=EJ765422.

Er, Sidika Nihan. 2018. “Mathematics Readiness of First-Year College Students and Missing Necessary Skills: Perspectives of Mathematics Faculty.” Journal of Further and Higher Education 42, no. 7: 937–52. https://doi.org/10.1080/0309877X.2017.1332354.

Finney, Sara J. and Gregory Schraw. 2003. “Self-Efficacy Beliefs in College Statistics Courses.” Contemporary Educational Psychology 28, no. 2 (April): 161–86. https://doi.org/10.1016/S0361-476X(02)00015-2.

Foley, Alana E., Julianne B. Herts, Francesca Borgonovi, Sonia Guerriero, Susan C. Levine, and Sian L. Beilock. 2017. “The Math Anxiety-Performance Link: A Global Phenomenon.” Current Directions in Psychological Science 26, no. 1: 52–58. https://doi.org/10.1177/0963721416672463.

Freeman, Scott, Sarah L. Eddy, Miles McDonough, Michelle K. Smith, Nnadozie Okoroafor, Hannah Jordt, and Mary Pat Wenderoth. 2014. “Active Learning Increases Student Performance in Science, Engineering, and Mathematics.” Proceedings of the National Academy of Sciences 111, no. 23: 8410–15. https://doi.org/10.1073/pnas.1319030111.

Grahe, Jon, Caitlin Faas, Holly M. Chalk, Hayley M. Skulborstad, Christopher Barlett, Justin W. Peer, Anthony Hermann et al. 2018. “Emerging Adulthood Measured at Multiple Institutions 2: The Next Generation (EAMMi2).” Open Science Framework Repository. Last modified February 2, 2021. https://osf.io/te54b/.

Gravelle, C. Donnan, Jessica E. Brodsky, Arshia K. Lodhi, Nicole M. Zapparrata, Elizabeth S. Che, Teresa M. Ober, and Patricia J. Brooks. 2024. “Remote Online Learning Outcomes in Introductory Psychology During the COVID-19 Pandemic.” Scholarship of Teaching and Learning in Psychology 10, no. 4: 442–70. https://doi.org/10.1037/stl0000325.

Grose-Fifer, Jillian, Patricia J. Brooks, and Maureen O'Connor. 2019. Teaching Psychology: An Evidence-Based Approach. John Wiley & Sons.

Gurung, Regan A. R., Jana Hackathorn, Carolyn Enns, Susan Frantz, John T. Cacioppo, Trudy Loop, and James E. Freeman. 2016. “Strengthening Introductory Psychology: A New Model for Teaching the Introductory Course.” American Psychologist 71, no. 2: 112–24. https://doi.org/10.1037/a0040012.

Gurung, Regan A. R. and Arianna M. Stone. 2023. “You Can’t Always Get What You Want and It Hurts: Learning During the Pandemic.” Scholarship of Teaching and Learning in Psychology 9, no. 3: 264–75. https://doi.org/10.1037/stl0000236.

Heinze, Justin E., Allison Krusky-Morey, Kevin J. Vagi, Thomas M. Reischl, Susan Franzen, Natalie K. Pruett, Rebecca M. Cunningham, and Marc A. Zimmerman. 2018. “Busy Streets Theory: The Effects of Community-Engaged Greening on Violence.” American Journal of Community Psychology 62, no. 1–2: 101–09. https://doi.org/10.1002/ajcp.12270.

Hurley, Roderick. 2021. “Introductory Psychology Hidden Figures Poster—Online Group Presentation.” CUNY Academic Works: Open Educational Resources 17. https://academicworks.cuny.edu/si_oers/17.

Ilieva, Rositsa T., Tanzina Ahmed, and Anita Yan. 2019. “Hungry Minds: Investigating the Food Insecurity of Minority Community College Students.” Journal of Public Affairs 19, no. 3: e1891. https://doi.org/10.1002/pa.1891.

Jaggars, Shanna S., Benjamin A. Motz, Marcos D. Rivera, Andrew Heckler, Joshua D. Quick, Elizabeth A. Hance, and Caroline Karwisch. 2021. “The Digital Divide among College Students: Lessons Learned from the COVID-19 Emergency Transition. Policy Report.” Midwestern Higher Education Compact. https://files.eric.ed.gov/fulltext/ED611284.pdf.

Johannssen, Arne, Nataliya Chukhrova, Friederike Schmal, and Kevin Stabenow. 2021. “Statistical Literacy—Misuse of Statistics and Its Consequences.” Journal of Statistics and Data Science Education 29, no. 1: 54–62. https://doi.org/10.1080/10691898.2020.1860727.

Kennedy, Michael J., Shanna Eisner Hirsch, Sarah E. Dillon, Lindsey Rabideaux, Kathryn D. Alves, and Melissa K. Driver. 2016. “Using Content Acquisition Podcasts to Increase Student Knowledge and to Reduce Perceived Cognitive Load.” Teaching of Psychology 43, no. 2 (April): 153–58. https://doi.org/10.1177/0098628316636295.

Lilienfeld, Scott O., Steven Jay Lynn, John Ruscio, and Barry L. Beyerstein. 2009. 50 Great Myths of Popular Psychology: Shattering Widespread Misconceptions about Human Behavior. John Wiley & Sons.

Lloyd, Steven A. and Chuck L. Robertson. 2012. “Screencast Tutorials Enhance Student Learning of Statistics.” Teaching of Psychology 39, no. 1 (January): 67–71. https://doi.org/10.1177/0098628311430640.

Long, Holly E. and Jeffrey T. Coldren. 2006. “Interpersonal Influences in Large Lecture-Based Classes: A Socioinstructional Perspective.” College Teaching 54, no. 2 (Spring): 237–43. http://www.jstor.org/stable/27559273.

Lutsky, Neil. 2006. “Teaching Quantitative Reasoning.” APS Observer 19, March 1, 2006. https://www.psychologicalscience.org/observer/teaching-quantitative-reasoning?es=true&es=true.

McCloskey, Daniel, Patricia J. Brooks, and Kathleen Cumiskey. 2024. “Using Peer-Enhanced Blockchain-Based Learning Environments to Promote Student Engagement and Retention.” Paper presented at the 2024 CUNY IT Conference, New York, NY, December 5, 2024.

McMurtrie, Beth. 2024. “Is This the End of Reading?” The Chronicle of Higher Education, May 9, 2024. https://www.chronicle.com/article/is-this-the-end-of-reading.

National Association of Colleges and Employers. 2021. “Career Readiness: Competencies for a Career-Ready Workforce.” https://www.naceweb.org/career-readiness/competencies/career-readiness-defined/.

National Student Clearinghouse Research Center. 2022. Completing College. https://nscresearchcenter.org/wp-content/uploads/Completions_Report_2021.pdf.

National Student Clearinghouse Research Center. 2024. Persistence and Retention. https://nscresearchcenter.org/persistence-retention/.

Neufeld, Garth, Samantha Estrada Aguilera, Kelly Goedert, Janet Peters, V. N. Vimal Rao, Viji Sathy, Tamarah Smith, and Jessica Hartnett. 2022. “Statistical Literacy, Reasoning, and Thinking: Guidelines 2.0.” Society for the Teaching of Psychology. https://teachpsych.org/page-1863179.

Ng, Wan. 2012. "Can We Teach Digital Natives Digital Literacy?" Computers & Education 59, no. 3: 1065–78. https://doi.org/10.1016/j.compedu.2012.04.016.

Ozgur, Ceyhun, Michelle Kleckner, and Yang Li. 2015. “Selection of Statistical Software for Solving Big Data Problems: A Guide for Businesses, Students, and Universities.” SAGE Open 5, no. 2 (April-June). https://doi.org/10.1177/2158244015584379.

Pekrun, Reinhard. 2006. “The Control-Value Theory of Achievement Emotions: Assumptions, Corollaries, and Implications for Educational Research and Practice.” Educational Psychology Review 18: 315–41. https://doi.org/10.1007/s10648-006-9029-9.

Pekrun, Reinhard, Stephanie Lichtenfeld, Herbert W. Marsh, Kou Murayama, and Thomas Goetz. 2017. “Achievement Emotions and Academic Performance: Longitudinal Models of Reciprocal Effects.” Child Development 88, no. 5 (September/October): 1653–70. https://doi.org/10.1111/cdev.12704.

Perna, Laura W. 2010. Understanding the Working College Student: New Research and Its Implications for Policy and Practice. Stylus Publishing. https://eric.ed.gov/?id=ED515052.

Porchea, Sameano F., Jeff Allen, Steve Robbins, and Richard P. Phelps. 2010. “Predictors of Long-term Enrollment and Degree Outcomes for Community College Students: Integrating Academic, Psychosocial, Socio-demographic, and Situational Factors.” The Journal of Higher Education 81, no. 6: 680–708. https://doi.org/10.1080/00221546.2010.11779077.

Powers, Kasey L., Patricia J. Brooks, Magdalena Galazyn, and Seamus Donnelly. 2016. “Testing the Efficacy of MyPsychLab to Replace Traditional Instruction in a Hybrid Course.” Psychology Learning & Teaching 15, no. 1 (March): 6–30. https://doi.org/10.1177/1475725716636514.

R Core Team. 2024. R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria. https://www.R-project.org/.

Ridgway, Jim, James Nicholson, and David Stern. 2017. “Statistics Education in a Post-Truth Era.” In Teaching Statistics in a Data Rich World: Proceedings of the Satellite Conference of the International Association for Statistical Education (IASE). https://iase-web.org/documents/papers/sat2017/IASE2017%20Satellite%20N56_RIDGWAY.pdf.

Roberts, Raoul A., C. Donnan Gravelle, Elizabeth S. Che, Nicolas Zapparrata, Arshia K. Lodhi, and Patricia J. Brooks. 2025. “The Limited Role of Students’ Intrapersonal Beliefs and Skills in Predicting Learning Outcomes in an Online Introductory Psychology Course.” Under review.

Rose, Maya C., Jessica E. Brodsky, Elizabeth S. Che, and Patricia J. Brooks. 2022. “Teaching about Systemic Ethical Misconduct Increases Awareness of Ethical Principles: A Replication and Extension of Grose-Fifer’s (2017) Tuskegee Roleplay Activity.” Teaching of Psychology 49, no. 3: 199–205. https://doi.org/10.1177/00986283211015981.

Schwartz, Anna M., Kasey L. Powers, Magdalena Galazyn, and Patricia J. Brooks. 2017. “Crowdsourcing Course Preparation Strengthens Teaching through Collaboration.” In How We Teach Now: The GSTA Guide to Student-Centered Teaching, edited by Rita Obeid, Anna M. Schwartz, Christina Shane-Simpson, and Patricia J. Brooks, 69–82. Society for the Teaching of Psychology. http://teachpsych.org/ebooks/howweteachnow.

Shea, Peter and Temi Bidjerano. 2010. “Learning Presence: Towards a Theory of Self-Efficacy, Self-Regulation, and the Development of a Communities of Inquiry in Online and Blended Learning Environments.” Computers & Education 55: 1721–31. https://doi.org/10.1016/j.compedu.2010.07.017.

Shen, Demei, Moon-Heum Cho, Chia-Lin Tsai, and Rose Marra. 2013. “Unpacking Online Learning Experiences: Online Learning Self-Efficacy and Learning Satisfaction.” The Internet and Higher Education 19: 10–17. https://doi.org/10.1016/j.iheduc.2013.04.001.

Stigmar, Martin. 2016. “Peer-to-Peer Teaching in Higher Education: A Critical Literature Review.” Mentoring & Tutoring: Partnership in Learning 24, no. 2: 124–36. https://doi.org/10.1080/13611267.2016.1178963.

Teman, Eric D. 2013. “A Rasch Analysis of the Statistical Anxiety Rating Scale.” Journal of Applied Measurement 14, no. 4: 414–34. https://pubmed.ncbi.nlm.nih.gov/24064581/.

The Learning Network. 2021. “Introduction to ‘What’s Going on in This Graph?’” The New York Times, July 28, 2021. https://www.nytimes.com/2021/07/28/learning/introduction-to-whats-going-on-in-this-graph.html.

Toness, Bianca Vázquez and Sharon Lurye. 2022. “Massive Learning Setbacks Show Covid’s Sweeping Toll on Kids.” The Hechinger Report, October 28, 2022. https://hechingerreport.org/massive-learning-setbacks-show-covids-sweeping-toll-on-kids/.

Tough, Paul. 2023. “Americans are Losing Faith in the Value of College. Whose Fault is That?” The New York Times, September 5, 2023. https://www.nytimes.com/2023/09/05/magazine/college-worth-price.html.

Trassi, Angélica P., Sophie J. Leonard, Larissa D. Rodrigues, Jose A. Rodas, and Flávia H. Santos. 2022. “Mediating Factors of Statistics Anxiety in University Students: A Systematic Review and Meta‐Analysis.” Annals of the New York Academy of Sciences 1512, no. 1 (June): 76–97. https://doi.org/10.1111/nyas.14746.

Tsai, Chin-Chung, Shih-Chyueh Chuang, Jyh-Chong Liang, and Meng-Jung Tsai. 2011. “Self-Efficacy in Internet-Based Learning Environments: A Literature Review.” Educational Technology and Society 14, no. 4: 222–40. https://www.jstor.org/stable/10.2307/jeductechsoci.14.4.222.

Williams, Tiffany R., Tanesha L. Walker, and Whitney N. Wyatt. 2022. "Conceptualizing Racism Through a Systemic Trauma Lens: Impacts on Black College Students." Journal of Psychotherapy Integration 32, no. 1: 49–63. https://doi.org/10.1037/int0000206.

Zapparrata, Nicolas, C. Donnan Gravelle, Elizabeth S. Che, Arshia K. Lodhi, Raoul Roberts, Peter J. Johnson, Riya Anjaria, and Patricia J. Brooks. 2025. “Supplemental Materials: Fostering Quantitative Reasoning in Introductory Psychology through Asynchronous Assignments Featuring Low-Stakes Quizzes, Data Analysis, and Visualization Activities.” Open Science Framework Repository. https://osf.io/mju6z/.

Zeidner, Moshe. 1991. “Statistics and Mathematics Anxiety in Social Science Students: Some Interesting Parallels.” British Journal of Educational Psychology 61, no. 3 (November): 319–28. https://doi.org/10.1111/j.2044-8279.1991.tb00989.x.

Zengilowski, Allison, Irum Maqbool, Surya Pratap Deka, Jesse C. Niebaum, Diego Placido, Benjamin Katz, Priti Shah, and Yuko Munakata. 2023. "Overemphasizing Individual Differences and Overlooking Systemic Factors Reinforces Educational Inequality." npj Science of Learning 8, no. 1: 13. https://doi.org/10.1038/s41539-023-00164-z.

Acknowledgements

The authors thank Jessica Brodsky, Katie Cumiskey, and Dan McCloskey for their support in course design and implementation of this project. Curriculum development has been supported by NSF Grant 2318196 awarded to McCloskey, Brooks, and Cumiskey.

About the Authors

Nicolas Zapparrata, PhD, is an Educational Psychologist with a specialization in quantitative methods and statistical modeling. His research interests include the adoption and investigation of robust statistical modeling techniques across various applications in both Psychology and Education, such as meta-analysis, cognitive processing, and neurodevelopmental disorders. He is also interested in investigating student learning outcomes in higher education, promoting statistical literacy through course curriculum and design, and facilitating active learning in students through his instructional approaches.

C. Donnan Gravelle is a PhD candidate at the CUNY Graduate Center and an adjunct instructor of statistics in the Department of Psychology at the College of Staten Island, CUNY. His primary research interests involve applying advanced quantitative techniques to investigate topics of applied and theoretical concern. Recent projects have investigated learning outcomes of college students in general education coursework, critical activism and implicit outgroup bias, neurodevelopmental disorders (e.g. dyslexia, Developmental Language Disorder), and network models of vocabulary knowledge.

Elizabeth S. Che has a PhD in Educational Psychology with a specialization in learning, development, and instruction from the CUNY Graduate Center. She is a postdoctoral researcher, adjunct assistant professor of research methods and statistics, and Introductory Psychology Coordinator in the Department of Psychology at the College of Staten Island. Her research interests relate broadly to language development, course assessment, and pedagogical techniques to facilitate transferable skills.

Arshia K. Lodhi has a Master of Science degree in Quantitative Methods in the Social Sciences from the CUNY Graduate Center (QMSS, 2024). She possesses extensive experience in program coordination, data management, and undergraduate teaching. Arshia has research experience across various educational institutions and community organizations, where she conducted classroom-based and online research, including studies of fact-checking strategies, contributors to language learning, and an EEG experiment.

Peter J. Johnson is a PhD student in the Quantitative Methods in Educational Psychology program at the CUNY Graduate Center and an adjunct lecturer of statistics and research methods in the Department of Psychology at CUNY Hunter College. His primary research involves the development of effect sizes, novel models, and other statistical and quantitative methodology within the context of Item Response Theory. He also applies quantitative methodology across various areas within the fields of Education and Psychology, including feedback, language learning, and civic engagement in children and young adults from non-democratic countries.

Raoul A. Roberts is a Developmental Psychology PhD student at the CUNY Graduate Center and an adjunct lecturer of Experimental Psychology in the Department of Psychology at CUNY Brooklyn College. His primary research focuses on how internal attributes such as academic self-efficacy and academic motivation are related to performance in Mathematics and Statistics at the tertiary level. He is also interested in researching the effectiveness of evidence-based instructional methods in his role as an educator at the K-12 level.

Riya Anjaria is a PhD student in Educational Psychology at the CUNY Graduate Center. Her research interests include curriculum development, vocabulary acquisition and development, and integrating technology in the classroom. She holds a Master’s degree in Learning, Cognition and Development from Rutgers University-New Brunswick. Riya’s prior experience includes teaching English as a Second Language in India and working as a Research Assistant at Rutgers, where she helped develop curriculum to support students’ epistemic cognition and teachers’ pedagogy.

Patricia Brooks is Professor of Psychology at the College of Staten Island and the Executive Officer of the PhD Program in Psychology at the CUNY Graduate Center. Her research interests are in three broad areas: (1) individual differences in language learning over the lifespan, (2) the impact of technology on cognition and learning, and (3) the development of effective pedagogy to support diverse learners. She served as President of the Eastern Psychological Association (2024–2025) and presently serves as co-Editor in Chief of the journal Language Development Research (2025–2030).

  1. Note that only two students scored 0% on the statistics vocabulary item type—one of them attempted only one statistics quiz question, and the other attempted only four questions. ↑

  2. Items were administered on a 5-point Likert scale, with the first three items rating anxiety (1 = Not at all Anxious, 5 = Extremely Anxious) and the second three items rating agreement (1 = Strongly Disagree, 5 = Strongly Agree). Item descriptives reflect the original scoring of the items. ↑

  3. Items were administered as a 5-point Likert scale rating (1 = Not at all confident, 5 = Completely confident). ↑

  4. Intraclass Correlation = .041. ↑

  5. Intraclass Correlation = 0.00. The intraclass correlation is exactly 0 because the estimated variance between course sections was zero, meaning section membership accounted for none of the variation in students’ scores (see the brief sketch following these notes). ↑

  6. Intraclass Correlation = .080. ↑

  7. Intraclass Correlation = .039. ↑
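
A note on the intraclass correlations reported in these footnotes: the ICC is the proportion of total outcome variance attributable to differences between course sections, that is, the between-section variance divided by the sum of the between-section and residual variances. The sketch below shows one way such a value could be obtained from an intercept-only mixed model in R using lme4 (Bates et al. 2015); it is illustrative only, and the data frame quiz_scores with columns score and section is a hypothetical placeholder rather than the study's actual variable names.

    # Minimal sketch, assuming a data frame 'quiz_scores' with columns 'score' and 'section'
    library(lme4)
    m0 <- lmer(score ~ 1 + (1 | section), data = quiz_scores)  # intercept-only model, random effect of section
    vc <- as.data.frame(VarCorr(m0))                           # extract variance components
    between <- vc$vcov[vc$grp == "section"]                    # between-section variance
    within  <- vc$vcov[vc$grp == "Residual"]                   # residual (within-section) variance
    between / (between + within)                               # ICC; equals 0 when between-section variance is 0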

This entry is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license.
