1. Introduction
Nearly all US college students are required to take at least one mathematics course, either as part of their major area of study or as a general education requirement. These requirements make intuitive sense; mathematical reasoning is a valued skill among employers [1], and many disciplines in the natural and social sciences rely on mathematics and statistics as a means of analyzing data. Beyond its practical utility, mathematics can be an important intellectual experience for students.
Despite this, college mathematics requirements are an obstacle for many college students, something to be endured rather than learned [2]. An analysis of nationally representative transcript data found that introductory or gateway college mathematics courses had the highest D-Fail-Withdraw (DFW) rate of any subject, both among bachelor’s degree earners and among all college students [3]. The problem of gateway mathematics course success has been acknowledged by researchers and math educators [4,5]. One response to the problem of low completion rates is Math Pathways, which routes students into relevant introductory mathematics courses based on their intended fields of study [6]. The Dana Center Math Pathways (DCMP) model was implemented at four community colleges and evaluated through a randomized controlled trial (RCT) of 1411 students. This study found that students assigned to the DCMP intervention were 10 percentage points more likely to pass their first-year math course than those in the control group [7].
Another response to students’ challenges with mathematics has been to alter pedagogy. Active learning is a pedagogical approach that emphasizes students’ collaborative work, often taking the form of peer discussions and group problem-solving, with teachers serving primarily as facilitators of student work [8]. Active learning has been shown to be positively associated with student learning and course success in STEM fields of study. Freeman et al. [9] conducted a meta-analysis of 225 studies, finding that active learning was associated with an average increase of 6 percent in exam scores, and that students in traditional lecture sections were 50 percent more likely to fail than those in active learning sections. A recent RCT of 811 students enrolled in Calculus I courses at a large public university showed that students who took calculus in an active learning modality had higher rates of course success and greater concept mastery [10]. Active learning has also been shown to be adaptable to courses involving computer software and programming [11].
The case study presented below is based on the author’s experience implementing an active learning model in an introductory statistics course for students majoring in sociology and related social sciences. As such, it can be regarded as practitioner research [12], which aims to improve educational practice rather than to create generalizable knowledge. The learning goals of this course, which is typical in undergraduate social science programs, include both understanding basic statistical concepts and applying these concepts using statistical analysis software. The paper proceeds by describing the active learning model implemented in the course, and the data collected for the study. It then reports study findings from three sources and concludes with a discussion of findings based on the themes in the literature cited above.
2. Methods
2.1. Building Thinking Classrooms
The model of active learning used in the course is based on Peter Liljedahl’s [13] Building Thinking Classrooms in Mathematics—hereafter abbreviated as BTC. Liljedahl describes K-12 mathematics classrooms as normative spaces where students do more slacking, stalling, pretending, and mimicking teacher instructions than actual thinking. He relates the prevalence of these behaviors to institutional norms of classroom conduct (“studenting”) which have little to do with teaching and learning [13] (pp. 7–9). Based on classroom observations and short-duration experiments conducted over 15 years, Liljedahl describes fourteen teaching practices that can increase students’ thinking and engagement in mathematics classrooms [13] (p. 14). These practices range from “how students are placed into collaborative groups” and “how we answer questions”, to “what homework looks like” and “how we use formative assessment”.
The course presented in this paper implemented a limited version of BTC. Emphasis was placed on four features of Liljedahl’s [13] model:
- In the early weeks, students were assigned to complete thinking tasks—engaging non-curricular problem-solving activities—to build their collaborative skills. Later group tasks combined introductory concepts in descriptive and inferential statistics with the basic elements of programming in STATA® statistical software (Version 18).
- Students were assigned to visibly random groups at the beginning of each class session.
- The instructor and the teaching assistant emphasized answering student questions only when those questions promoted additional thinking.
- At the end of each class session, the instructor led students in consolidation discussions to extract and refine key concepts from the work produced in student groups.
Liljedahl’s recommendations for using vertical non-permanent surfaces and organizing the physical classroom were modified for this course and are discussed below. As the semester progressed, the instructor made use of hints and extensions to maintain the flow of group work.
2.2. Study Setting
The course took place at a small liberal arts college in the northeastern United States. The course—Statistics for the Social Sciences—enrolled 20 students, most of whom were in their second or third year, and majoring in sociology or other social sciences. The instructor had taught the course in the prior year using a lecture-based approach, with a lab component on alternating days. Implementation of the active learning approach was supported by a fellowship at the college’s Center for Teaching and Learning. The redesigned course was implemented in Fall 2023. The course was facilitated with the aid of an undergraduate teaching assistant who was already familiar with the model of active learning being used in the course. Prior to the start of the semester, the instructor administered a brief survey to incoming students to assess their familiarity with both statistical concepts and software programming. Responses to this pre-course survey are not included in the study data.
2.3. Study Data and Analysis
The case study data come from three sources. The first is student-produced group work that was collected after each class session. The second is an anonymous survey that students completed at the midpoint of the semester. The survey asked students about their prior experience with active learning, their level of preparation for this course relative to others, and their amount of learning in this course relative to others. All enrolled students completed the midpoint survey; the questionnaire is included as Supplementary Material Item S1. The final source of data is the anonymous course evaluations offered at the end of the course. These evaluations included closed questions and an open-ended question where students could offer their views on the course. Nineteen out of twenty students completed the course evaluation.
Saved group work was analyzed by the instructor to understand how students engaged with thinking tasks and how this engagement changed over time. Responses to both the midpoint survey and the final course evaluation, including open- and closed-ended questions, were analyzed thematically.
The primary research questions that guided this study were:
- Can the BTC model be effectively implemented in a Statistics for the Social Sciences course, given its technology requirements and learning goals?
- How will the grouping strategy and instructional format impact student learning?
- How will the grouping strategy and instructional format impact student perceptions of the course?
3. Results
3.1. Implementation
In the summer of 2023, the author revised the syllabus and course assignments for Statistics for the Social Sciences using Building Thinking Classrooms as a guide. The data sets and problem sets from the course textbook were well-suited for inclusion in the redesigned course, though they had to be adapted as thinking tasks. The data sets covered topic areas familiar to undergraduate sociology students, and the problem sets included a range of difficulty levels—allowing for a distinction between core thinking tasks (which all groups completed in class) and extensions (for groups that finished the core tasks early).
One primary task in redesigning the course was crafting the 5-minute introductory scripts to set up the thinking tasks. Each script established the significance of the day’s content, defined key terms (e.g., standard deviation or t-test), and provided the basic STATA programming commands needed to complete the thinking tasks. An example script is provided as Supplementary Material Item S2.
The construction of these revised course materials was an important moment for reflection and for clarifying the learning goals of the course. Rather than teaching every nuance of the material covered in the course text, the 5-minute introductory script was a curated version of the material, focusing on the concepts needed to support the work students would be doing. While this initially seemed to make the course less rigorous, much of the content that was not in the script was covered either in practice as the instructor circulated through the student groups, or in the consolidation discussion following group work.
3.2. Adapting Building Thinking Classrooms to Statistics for Social Sciences
A second challenge was to adapt BTC to the software programming component of the course. One of Liljedahl’s core recommendations is that group work take place on vertical non-permanent surfaces (i.e., whiteboards) so that students and the instructor can see the work as it unfolds. A related recommendation was that classroom furniture be arranged in clusters rather than rows of desks. But one primary learning goal of this course is for students to have programming experience in the STATA statistical software package, which required a computer lab with fixed seating. In a supplemental text, Liljedahl [14] offers a guide for instructors implementing BTC for remote learning—a necessity for most if not all classrooms during the period of COVID-19 restrictions. Two chapters of this text offered insights for using BTC in classrooms with fixed seating and for synchronous virtual settings.
To adapt to a classroom environment with fixed seating in which working on computer screens was a necessity, the instructor set up virtual whiteboards using Google’s Jamboard application (Version 2.1). Since a full class had 20 students in seven groups, the instructor created seven identical Jamboard frames—with columns for STATA commands used, results obtained, and answers to task questions. Students were asked to put the names of group members on the top of their Jamboard as they started their thinking tasks—which also obviated the need to take attendance. To encourage what Liljedahl [13] (p. 47) calls “knowledge mobility”, groups were encouraged to look at the frames being produced by other groups. At the end of each session, the teaching assistant downloaded the Jamboard frames and then cleared their contents for the next class session.
3.3. Group Work Outcomes
As students walked in each day, they were placed in random groups of three using a visible process. At first, we used color-coded cards with group numbers. Later, we used a randomizer application on a mobile device. When a group was full, we ignored any further random assignment to that group. If, because of absences, any group had a single member, we borrowed from a full group to make sure nobody was working alone. While Liljedahl [13] (p. 46) noted that some students in his experiments resisted random grouping, we did not observe any resistance in this class. Along with their random assignment, students were provided with the daily scripts as a handout, which also contained their group work tasks.
The first two class sessions of the semester were used to establish the norms of group work by having students work on engaging non-curricular thinking tasks [13] (p. 29). These initial tasks—which were adapted directly from Building Thinking Classrooms—aimed to help students become used to working in randomly assigned groups and producing group work for discussion at the end of the class session. The third class session introduced the layout of STATA’s user interface without presenting any statistical content. While using initial sessions in this way took time from curricular topics, the class ultimately covered the same amount of material as the previous year’s lecture-driven section. The students in this section also became more proficient programmers in STATA than those in the previous year. Students acclimated to the group work expectations quickly and effectively. By the fourth week, most student groups were producing useful STATA output and providing competent answers to thinking task questions.
The first half of the course focused on descriptive statistics—including measures of central tendency and dispersion, data visualization (tables and graphs), and correlation. Based on pre-course survey results, many of the students (70 percent) had some familiarity with basic statistical concepts. Given this, the short scripts introducing the thinking tasks were more than enough to refresh and extend their understanding, and to add the programming component. Indeed, student questions during group work in the early portion of the course were more focused on STATA programming, a further indication that the concepts were already understood.
Figure 1a,b are two examples of students’ Jamboard frames from the first half of the course. These examples are relatively strong; not all groups clearly delineated results from answers, and many frames lacked organization. Consolidation discussions at the end of class sessions, and student performance on the midterm assessment, indicated that most groups had thoroughly attempted the tasks and understood the associated concepts. Knowledge mobility happened early in the semester when students shared techniques for populating the Jamboards with screenshots from the STATA results window.
3.4. Mid-Semester Adaptation
The latter portion of the course focused on inferential statistics and hypothesis testing—including confidence intervals, independent samples t-tests, and one-way analysis of variance (ANOVA). Responses to an open-ended question in the midpoint survey, and informal feedback from individual students, indicated that students wanted additional lecture content as the course topics were becoming less familiar. To be responsive to student requests, the instructor devoted additional time—usually half of the first day’s class each week—to explaining the statistical concepts and demonstrating the basic STATA programming associated with the thinking tasks. Though this was a marked departure from the BTC model, lecture time was kept to a minimum, and student groups still completed thinking tasks independently.
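To make the inferential content concrete: an independent samples t-test of the kind students ran with STATA's `ttest` command reduces to a short computation. The sketch below is a standard-library Python analogue using the pooled-variance formulation, with two hypothetical groups standing in for course data.

```python
# Illustrative only: a pooled independent-samples t statistic computed
# with Python's standard library, analogous to STATA's `ttest` command.
# The two groups below are hypothetical exam scores, not course data.
import math
import statistics

group_a = [72, 75, 68, 80, 77, 74]
group_b = [65, 70, 64, 71, 66, 69]

n_a, n_b = len(group_a), len(group_b)
mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)

# Pooled variance assumes the two groups share a common population variance
pooled_var = ((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)
se = math.sqrt(pooled_var * (1 / n_a + 1 / n_b))

t = (mean_a - mean_b) / se
df = n_a + n_b - 2   # compare |t| to the critical value at this df

print(f"t = {t:.3f} with {df} degrees of freedom")
```

Checking the assumptions behind the pooled variance, and interpreting the resulting t statistic against a critical value, were exactly the kinds of questions that dominated group work in this portion of the course.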
Figure 2 is an example of a Jamboard frame from the latter portion of the course. Notably, the written content in the Jamboards is more extensive, which reflects both the complexity of the task questions and students’ increased statistical concept knowledge. Both student questions during group work and consolidation discussions in the latter half of the course focused more on statistical concepts—such as meeting assumptions for statistical tests and interpreting test statistics—as students had become proficient in STATA programming.
3.5. Students’ Perceptions of Preparation and Learning
The midpoint survey was intended to gauge students’ perceptions of the active learning structure of the course relative to their prior and concurrent course experiences. Active learning was new to most students; 75 percent of students indicated that they had never taken a group-work based course before, and only 10 percent had done so in college. When asked to compare this course to other courses, students in this course generally felt they prepared less and learned more. Forty-five percent of students indicated preparing less for this course than others, with the remaining 55 percent saying their preparation was about the same. This finding is important given that 55 percent indicated learning more in this course than others, with 30 percent indicating about the same amount of learning.
In open-ended survey responses, where students were provided space to reflect in their own words, one student commented positively on the group-work structure:
“I’m really enjoying this class, because of the social/groupwork aspect—it makes it fun to come to class and chat with new people every time—and also because I feel like I actually understand the materials. I think I would really struggle to stay engaged and learn the materials in a regular lecture-type class, so I appreciate you taking the time to switch up the class for our benefit! In my opinion, it is really working.”
Another student was more tentative:
“I think the group work is helpful, but I would appreciate a little more lecture on Stata and some of its functions at the beginning of class. This software is still a little tricky and I feel like sometimes in our groups, one person moves quickly because they know what they’re doing, but can’t explain it well enough for me to understand.”
3.6. Students’ Perceptions upon Course Completion
The final student evaluations—which utilized the college’s standard form—allowed for additional insights on students’ preparation for and satisfaction with the course. Consistent with the midpoint survey, data from the final evaluations indicated that most students (61 percent) spent between 0 and 3 hours per week preparing for class, even though 79 percent found the course “moderately challenging” or “very challenging”. One student reflected on this question:
“I am not a math-oriented person, so I was definitely intimidated by the material, but because of the structure of the class, it was easy to learn and understand.”
Also consistent with the midpoint survey, nearly all students found the class sessions “very worthwhile” (84 percent) or “moderately worthwhile” (11 percent). Commenting on this question, one student remarked:
“I really liked the groupwork format; I always hated math lectures in high school so the groupwork format was new to me, and I think I learned much more than I would have in a lecture-based class.”
Most students (74 percent) felt that the two independent assessments (exams) were thoroughly connected to the work they carried out in class. One student commented on this latter connection:
“Because this class was very hands-on, you really had to be paying attention during class in order to perform well on the exams (you couldn’t really cram or cheat). I thought this was a fantastic structure and I felt like the course material allowed me to demonstrate my knowledge to its fullest.”
When asked for general reflections at the end of the evaluation questionnaire, three students directly commented on the structure of the course:
“I want to express how much I learned this semester in this class on a subject that I felt very nervous about at the beginning. I am not a strong math/computer person so I was very hesitant. However, I think this course ended up as the best case scenario, since it was so hands-on and peer-centered, which is seemingly how I work best in these settings.”
“The group style is the best format of the class I could think of. It helped with applied learning it was the most helpful of a math class I’ve taken.”
“I liked the group work aspect because of the direct application and learning, as well as talking about it directly after class. Felt I was actually absorbing the material because of this.”
3.7. Other Observations
Three observations came from beyond the stated sources of study data. In terms of student interaction, the daily random grouping meant that each student worked with every other student in the class at least once during the semester. In a college setting where students’ social spaces are often stratified by race and/or socioeconomic status, this was an unintended but very welcome consequence.
In terms of curriculum, the scripted thinking tasks initially pared down the material to focus on those concepts and program commands needed for the thinking tasks. In class sessions, however, discussion and presentation of additional concepts and program commands emerged from group work. These emergent discussions made the class sessions more dynamic and fun, allowing for instruction that was directly related to work students were doing in the classroom.
The in-class independent assessments were observed from the rear of the classroom, which allowed students’ screens to be visible for the exam period. Observing the midterm and final exams, it was clear that many students in this section had developed programming skills that exceeded those of the prior class section. This increased facility may have been a result of random grouping—which forced students out of their typical comfort zone. Another explanation is the general structure of the class—which focused on daily and continuous use of statistical software.
4. Discussion
Following on extensive research showing the benefits of active learning in STEM fields in general [9] and introductory college mathematics courses [10], this case study focused on implementing an active learning model in an introductory statistics course for students majoring in sociology. The author adapted Liljedahl’s [13] “Building Thinking Classrooms” model to a course in statistics for the social sciences. Data for the case study were collected from student-produced group work and from anonymous surveys conducted at the midpoint and end of the term.
Considering the first research question, the BTC model proved adaptable to the learning goals of the course. The conceptual and programming elements of the course were well-suited to Liljedahl’s scripted thinking task structure [13]. The practical issues presented by fixed classroom furniture and the need to work on computers were resolved using solutions suggested in Liljedahl’s [14] supplemental text. These adaptations could allow for a similar course to be offered in a hybrid or virtual classroom.
Returning to the second research question, students’ responses to the midpoint survey and the final course evaluation indicated that they carried out less outside preparation, but still learned as much or more than in other classes. These findings reflected the structure of the course, which required little advance preparation. They also correspond with research on improved student performance in active learning [9].
Finally, with respect to the third research question, open-ended responses to both the midpoint survey and the final course evaluation elaborated on students’ experiences in the course. Explicitly and implicitly, students favorably compared active learning in this course to lecture-based instruction. Importantly, students reported feeling more at ease learning mathematics—here, statistics—in an active group-based structure. These findings correspond with more systematic research on the mechanisms behind groupwork in science courses [15].
This is a case study of a single course section, so it makes no claims to generalizability. But the results are important for at least two reasons. First, it is a successful implementation of an active learning approach in a mathematics course directed at non-STEM students. This attends to the problem of student success in introductory or “gateway” mathematics courses identified in both empirical and policy research [2,4]. Moreover, data analytics and data science are fields with strong projected employment growth [16]. As such, finding engaging and effective ways of teaching such courses is an important area for more rigorous study.
Supplementary Materials
The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/educsci14111163/s1, Item S1: Midpoint Survey; Item S2: Example Thinking Task Script.
Funding
This research received no external funding.
Institutional Review Board Statement
Ethical review and approval were waived for this study because the study was conducted in a commonly accepted educational setting involving normal educational practices.
Informed Consent Statement
Respondent consent was waived because the research was conducted in a commonly accepted educational setting and involved normal educational practices. All survey responses were anonymous.
Data Availability Statement
Student work samples are contained within the article. Midpoint survey data are contained in the Supplementary Material. Post-course survey data are not in machine readable format.
Acknowledgments
The author would like to acknowledge Keyla Torres for her support as my teaching assistant in the course, the Center for Teaching Learning at the study institution for supporting this project, Stefanie Chambers for her feedback on an early draft of this manuscript, and Dina Anselmi for her early inspiration in pursuing this project.
Conflicts of Interest
The author declares no conflict of interest.
References
- Burning Glass Institute and Strada Institute for the Future of Work. Talent Disrupted: Underemployment, College Graduates, and the Way Forward. 2024. Available online: https://stradaeducation.org/report/talent-disrupted/ (accessed on 30 September 2024).
- Douglas, D.; Attewell, P. School mathematics as gatekeeper. Sociol. Q. 2017, 58, 648–669.
- Douglas, D.; Salzman, H. Math Counts: Major and Gender Differences in College Mathematics Coursework. J. High. Educ. 2020, 91, 84–112.
- National Research Council. The Mathematical Sciences in 2025; The National Academies Press: Washington, DC, USA, 2013.
- Saxe, K.; Braddy, L. A Common Vision for Undergraduate Mathematical Sciences Programs in 2025; Mathematical Association of America: Washington, DC, USA, 2015; Available online: https://www.cbmsweb.org/archive/Forum5/Common%20Vision%202025%20one%20Pager%20.pdf (accessed on 16 August 2024).
- Liston, C.; Getz, A. The Case for Mathematics Pathways; The Charles A. Dana Center at the University of Texas at Austin: Austin, TX, USA, 2019; Available online: https://dcmathpathways.org/sites/default/files/resources/2019-03/CaseforMathPathways_20190313.pdf (accessed on 30 August 2024).
- Sepanik, S.; Barman, S. Long-Term Effects of the Dana Center Math Pathways Model; MDRC: New York, NY, USA, 2019; Available online: https://mdrc.org/work/publications/long-term-effects-dana-center-math-pathways-model (accessed on 16 August 2024).
- Armellini, A.; Rodriguez, B.C.P. Active blended learning: Definition, literature review, and a framework for implementation. In Cases on Active Blended Learning in Higher Education; IGI Global: Hershey, PA, USA, 2021; pp. 1–22.
- Freeman, S.; Eddy, S.L.; McDonough, M.; Smith, M.K.; Okoroafor, N.; Jordt, H.; Wenderoth, M.P. Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. USA 2014, 111, 8410–8415.
- Kramer, L.; Fuller, E.; Watson, C.; Castillo, A.; Oliva, P.D.; Potvin, G. Establishing a new standard of care for calculus using trials with randomized student allocation. Science 2023, 381, 995–998.
- Briggs, T. Techniques for active learning in CS courses. J. Comput. Sci. Coll. 2005, 21, 156–165.
- EdFutures contributors. Practitioner Research. EdFutures. 2021. Available online: https://edfutures.net/index.php?title=Practitioner_Research&oldid=7615 (accessed on 14 October 2024).
- Liljedahl, P. Building Thinking Classrooms in Mathematics; Corwin Mathematics: Thousand Oaks, CA, USA, 2021.
- Liljedahl, P. Modifying Your Thinking Classroom for Different Settings: A Supplement to Building Thinking Classrooms in Mathematics; Corwin Mathematics: Thousand Oaks, CA, USA, 2021.
- Smith, M.K.; Wood, W.B.; Adams, W.K.; Wieman, C.; Knight, J.K.; Guild, N.; Su, T.T. Why Peer Discussion Improves Student Performance on In-Class Concept Questions. Science 2009, 323, 122–124.
- CompTIA. State of the Tech Workforce 2023; CompTIA: Downers Grove, IL, USA, 2024; Available online: www.comptia.org/content/research/state-of-the-tech-workforce (accessed on 30 September 2024).
Figure 1. (a) Example of group work Jamboard frame—Unit on Frequency Distributions. (b) Example of group work Jamboard frame—Unit on Measures of Central Tendency.
Figure 2. Example of group work Jamboard frame—Unit on Analysis of Variance (ANOVA).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).