The TLRC has invited several faculty and researchers to present their current projects to the UCI community. The presentations are casual, and you are invited to bring your lunch and a friend.
Location: 1030 AIRB
Time: 12-1 pm
|Jan 26, 2017||Kate Loudon||Optimizing the number of alternatives provided in multiple-choice questions|
Different versions of multiple-choice tests were administered to an advanced undergraduate class in human physiology as part of normal testing in the classroom. The goal was to evaluate whether the number of alternative answers per question influenced the effectiveness of this assessment.
|Feb 23, 2017||Pavan Kadandale||Divide and Conquer: Reimagining the multiple-choice test format|
Picking a format for assessing student learning is traditionally seen as a choice between free-response (FR) and multiple-choice (MC) questions. FR questions are viewed as better able to elicit higher-order thinking, but at the cost of being very labor-intensive to grade. Multiple-choice questions, on the other hand, while easy to grade, are usually used for lower-order questions. Even when they are used for higher-order questions, the efficacy of the distractors plays a huge role in determining the quality of an MC question and the kind of thinking required to answer it successfully. I will discuss my ideas on how we can achieve a better testing format by splitting a traditional MC exam into separate FR and Number Right Elimination Testing (NRET) MC components.
|Mar 9, 2017||Fernando Rodriguez||Student Study Skills and Academic Success|
|Apr 13, 2017||Brian Sato||Testing the Test: Are Exams Measuring Understanding?|
Grades in STEM courses are distributed under the assumption that high-performing students have a strong understanding of the course material. Similarly, multiple examples in the STEM education literature present exam performance as equivalent to student understanding. Despite these assumptions, we have little knowledge of the student thinking that accompanies high or low test scores. To investigate this relationship, we performed a series of written and verbal exercises with undergraduates studying biology. Twenty-two participants were presented with previously utilized exam questions and were instructed to write out their train of thought as they approached each question, in addition to providing an exam-like response. Half of the participants then took part in a retrospective interview to describe how they arrived at their answers. We coded the exam-like responses, using an instructor-generated rubric, to award an exam-like score. Using this score as a baseline, we then coded the entirety of participants’ writing for their understanding. We found that for over 25% of rubric items, there was a discrepancy between performance and understanding. These results highlight a potential need to re-evaluate our course assessments and to question the understanding those assessments value. Additionally, our work highlights the use of exams as a means to encourage students to engage with science as an ongoing process rather than one with an end point.
|Jun 1, 2017||Amanda Holton||Results of Three Years of Flipped Data: Comparison of Full and Partial Flips|
The studies discussed assess the impact of flipped instruction on exam performance in introductory chemistry courses. Quantitative statistical comparisons of study time, motivation, performance, and post-course performance will be presented, along with qualitative comments from student surveys. Initial studies showed that non-compliance had rippling negative effects in fully flipped classrooms; this was addressed in subsequent studies. Student comments and perceptions suggested that a fully flipped course may be too drastic a change for the skill, maturity, and drive of students in freshman chemistry courses. A softer approach to flipping was therefore implemented: a “Flipped Fridays” model with active-learning lectures on Mondays and Wednesdays. Because the control and treatment groups were enrolled in sequential years, post-course performance was studied to determine the successful outcomes for the class.
|Jun 8, 2017||Di Xu||EASEing students into college: Closing the achievement gap through a cohort program|
A number of reports have called for changes to existing educational practices to increase the quality, number, and diversity of STEM (Science, Technology, Engineering, Mathematics) graduates. The need for such action is coupled to the fact that first-generation, low-income, and underrepresented minority (URM) students in STEM fields exhibit disproportionately lower course performance, rates of retention, and continuation to graduate school. Drawing on the theory of learning communities and the existing literature on cohort programs, the Ayala School of Biological Sciences at the University of California, Irvine created the EASE (Enhanced Academic Success Experience) initiative, a program designed to aid less-prepared Bio Sci majors. The program was launched in Fall 2016, when all Bio Sci freshmen with a math SAT score of 600 or below were required to enroll in the EASE program. EASE students received supplemental instruction from a senior Bio Sci major, increased academic counseling, and opportunities to interact with faculty. Using a fuzzy regression discontinuity (RD) design, we examine the impact of EASE on a variety of student academic outcomes, including course performance in gateway biology courses and retention within the Biological Sciences major, as well as noncognitive measures such as sense of belongingness, motivation, and attitudes regarding science.