1. Structuring the Course

🧭 Overview

🧠 One-sentence thesis

The instructor structured two anthropology courses around explicit learning outcomes that emphasized synthesis, distinction of key terms, and active engagement rather than memorization, supported by TA-graded confidence measures and aligned with the university's mission to promote diversity and global engagement.

📌 Key points (3–5)

  • Course context: Two courses taught across consecutive quarters (Fall 2017 and Winter 2018) in Anthropology, with different enrollment sizes and student compositions.
  • Learning outcomes: Both courses required students to synthesize concepts, distinguish and define key terms, and describe/assess key concepts; Fall 2017 added an "explain" objective for general education purposes.
  • Assessment approach: TAs graded a confidence measure based on how much prompting students required and how quickly/certainly they responded.
  • Common confusion: Memorization vs. active engagement—the instructor explicitly stated memorization would not suffice; students needed to actively engage with content.
  • Institutional alignment: Courses were designed within a research-intensive university mission focused on diversity, knowledge generation, and public service.

📚 Course design and staffing

📚 Course offerings and enrollment

Two courses were offered in consecutive quarters:

| Quarter | Course | Enrollment | Student composition | Course type |
| --- | --- | --- | --- | --- |
| Fall 2017 | Diversity and Race | ~280 students | Relatively even across grade levels, slightly more sophomores | General education + diversity requirement |
| Winter 2018 | Diversity and Health | ~170 students | Mostly third-year students | Major requirement, cross-listed with Global Health Program |

  • Both courses were housed in the Department of Anthropology.
  • The second course had dual listing with the Global Health Program, indicating interdisciplinary reach.

👥 Teaching assistant structure

  • Fall 2017 (pilot): Five TAs, all doctoral students in Anthropology.
  • Winter 2018: Three TAs, including one returning TA from the pilot who took a lead role in training the others.
  • All seven TAs across both quarters were doctoral students in Anthropology, ensuring disciplinary expertise.
  • The returning TA served as institutional memory, informing new TAs of the assessment process.

🎯 Learning outcomes framework

🎯 Core learning outcomes (both courses)

The instructor provided learning outcomes at the beginning of each quarter, listed on syllabi and discussed in class. Students were expected to complete the course with the ability to:

  • Synthesize: the anthropological concept of race (Fall 2017) or health (Winter 2018).
  • Distinguish and define: key terms relevant to course content.
  • Describe and assess: key concepts within the discipline.

These outcomes were specific to each class's content but shared a common structure emphasizing conceptual understanding and critical engagement.

📖 Fall 2017 additional outcome

  • Explain key concepts: This additional objective was included because the course served as an undergraduate general education requirement.
  • Purpose: to promote diversity on campus and in society at large.
  • This outcome was not carried forward to Winter 2018, which served a different curricular function (major requirement rather than general education).

⚠️ Explicit expectations about learning approach

  • Winter 2018 syllabus: The instructor explicitly noted that "memorization would not suffice."
  • Active engagement: Students were explicitly expected to engage actively with course content.
  • Don't confuse: The courses were not designed for rote learning; the assessment and outcomes required students to work with concepts, not just recall them.

📊 Assessment methodology

📊 Confidence measure grading

TAs graded a confidence measure based on two criteria:

  • How much prompting students required: More prompting indicated lower confidence or understanding.
  • How quickly and certainly students responded: Faster, more certain responses indicated higher confidence.

Example: A student who answers immediately with certainty would receive a different confidence score than one who hesitates and requires multiple prompts to arrive at an answer.
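The excerpt gives no scoring formula for this measure; as a toy illustration only, the two observed criteria could be combined into a numeric score as in the sketch below (the weights and thresholds are invented, not the study's rubric).

```python
# Illustrative sketch: the study describes the criteria (prompting and
# response speed) but not a formula, so all numbers here are assumptions.

def confidence_score(prompts_needed: int, response_seconds: float) -> float:
    """Combine the two observed criteria into a hypothetical 0-1 score."""
    prompt_penalty = min(prompts_needed * 0.25, 0.75)       # more prompting -> lower score
    speed_penalty = 0.25 if response_seconds > 30 else 0.0  # hesitation loses points
    return max(0.0, 1.0 - prompt_penalty - speed_penalty)

# An immediate, unprompted answer scores 1.0; a hesitant answer that
# needed two prompts scores 0.25.
print(confidence_score(0, 5.0))   # 1.0
print(confidence_score(2, 45.0))  # 0.25
```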

🔍 Purpose of confidence assessment

  • The confidence measure was tied to the learning outcomes, assessing not just correctness but students' certainty and independence in applying concepts.
  • This approach aligns with the "active engagement" expectation—students needed to demonstrate they could work with material confidently, not just reproduce memorized information.

🏛️ Institutional context

🏛️ University mission alignment

The university is described as:

  • Large, public, research-intensive institution.
  • Mission: to transform the region and create a diverse global society.
  • Methods: educating, generating and disseminating knowledge and creative works, and engaging in public service.

🔗 Course alignment with mission

  • The research and course design align with the university's philosophy (the excerpt notes "Our research aligns with our university philosophy").
  • The Diversity and Race course explicitly served the university's goal to "promote diversity on campus and in society at large."
  • Both courses contributed to the diversity curriculum, supporting the institutional mission to create a diverse global society.

2. Setting

🧭 Overview

🧠 One-sentence thesis

The oral examination was implemented in large undergraduate anthropology courses at a public research university to align with institutional diversity goals and course learning outcomes emphasizing concept synthesis and active engagement rather than memorization.

📌 Key points (3–5)

  • Institutional context: A large, public, research-intensive university with a mission to promote diversity and transform the region through education and public service.
  • Course characteristics: Two anthropology courses (Fall 2017: ~280 students, general education Diversity and Race; Winter 2018: ~170 students, Diversity and Health for majors) with learning outcomes focused on synthesizing concepts, distinguishing key terms, and active engagement.
  • Staffing structure: Five TAs in the pilot quarter, three TAs in the second quarter (all doctoral students in Anthropology); one TA participated in both terms and led training.
  • Grading approach: TAs assessed both correctness and confidence based on how much prompting students needed and how quickly/certainly they responded.
  • Motivation for change: Previous written midterm format faced technical constraints (power outlets, Wi-Fi bandwidth, potential system crashes) that prevented secure in-class administration.

🏫 Institutional and course context

🎓 University mission and alignment

The university's mission is to transform the region and create a diverse global society by educating, generating and disseminating knowledge and creative works, and engaging in public service.

  • The research and oral examination approach align with the university's philosophy and institutional diversity initiatives.
  • The setting emphasizes both educational transformation and explicit commitments to diversity.

📚 Course details and student populations

| Course | Quarter | Enrollment | Student composition | Course type |
| --- | --- | --- | --- | --- |
| Diversity and Race | Fall 2017 | ~280 | Evenly distributed across grade levels (slightly more sophomores) | General education + diversity requirement |
| Diversity and Health | Winter 2018 | ~170 | Mostly third-year students | Major requirement (Anthropology + cross-listed with Global Health) |

  • Both courses were housed in the Department of Anthropology.
  • The courses served different student populations: general education vs. major-specific requirements.

🎯 Learning outcomes and expectations

📋 Shared learning objectives

The instructor provided course learning outcomes at the beginning of each quarter, listed on syllabi and discussed in class. Both courses expected students to:

  • Synthesize the anthropological concept of race or health
  • Distinguish and define key terms
  • Describe and assess key concepts

🔍 Fall 2017 additional objective

  • Students were expected to explain key concepts.
  • This additional outcome reflected the course's role as a general education requirement meant to promote diversity on campus and in society at large.

⚠️ Explicit anti-memorization stance

  • Winter 2018 syllabus stated that "memorization would not suffice."
  • Active engagement with course content was an explicit expectation.
  • Don't confuse: The focus was on concept-based comprehension and application, not rote learning.

👥 Teaching assistant structure

🧑‍🏫 TA composition and training

  • Fall 2017 (pilot): Five TAs
  • Winter 2018: Three TAs
  • All seven TAs were doctoral students in Anthropology.
  • One TA participated in both quarters and took a lead role in informing other TAs of the process.

📊 TA grading responsibilities

TAs graded the oral examination using two dimensions:

  1. Prompting required: How much guidance students needed to answer questions
  2. Response speed and certainty: How quickly and confidently students responded
  • The confidence measure was based on observable student behavior during the oral exam.
  • This approach assessed both content knowledge and student certainty/engagement.

🔧 Technical constraints that motivated change

💻 Previous midterm format

  • The course midterm was previously structured as a writing assignment: students wrote a letter explaining course concepts to an international student unfamiliar with race and racism in the United States.
  • Students were allowed to take the midterm outside the lecture hall, write in a word processing program, then copy and submit via the Learning Management System Blackboard (BB).

⚠️ Technical warnings and limitations

After consulting Educational Technology Services, the instructor received warnings about conducting an online BB examination in the lecture hall:

  • Lack of power outlets: Insufficient electrical infrastructure for all students
  • Limited Wi-Fi bandwidth: Network capacity issues in the lecture hall
  • System stability: Possibility that BB might crash during the examination period

🚫 Security trade-offs

  • Because students used word processing programs, the instructor could not freeze the BB screen during the examination.
  • The off-site, open-system format created assessment security concerns.
  • These technical and security limitations provided justification for exploring alternative assessment formats (leading to the oral examination approach).

3. Justification for Study

🧭 Overview

🧠 One-sentence thesis

The instructor moved from a written midterm to an oral examination format primarily to address academic integrity violations and student feedback that the written exam was too easy to cheat on.

📌 Key points (3–5)

  • Original midterm format: students wrote a letter explaining course concepts to an international student unfamiliar with U.S. race and racism.
  • Technical constraints forced off-site testing: lack of power outlets, limited Wi-Fi bandwidth, and risk of system crashes meant students took the exam outside the lecture hall using word processors.
  • Academic integrity crisis: numerous suspected violations occurred; at least two cases led to quarter-long suspensions after lengthy adjudication processes.
  • Student feedback on cheating: students themselves reported that the written midterm was "too easy to cheat."
  • Common confusion: the shift to oral exams was not primarily pedagogical—it was a direct response to cheating problems and technical limitations.

🚨 The original written midterm and its technical problems

📝 What the written midterm asked

  • Students had to write a letter explaining course concepts to an international student unfamiliar with race and racism in the United States.
  • The format was intended to assess understanding of diversity-related course material.

💻 Why students couldn't take it in the lecture hall

After consulting Educational Technology Services, the instructor received warnings about conducting an online exam via the Learning Management System Blackboard (BB) in the lecture hall:

  • Lack of power outlets: not enough charging points for all students' devices.
  • Limited Wi-Fi bandwidth: the lecture hall network couldn't handle the load.
  • System crash risk: BB might crash during the examination period.

🏠 The workaround and its consequences

To avoid these technical issues, students were allowed to:

  • Take the midterm outside the lecture hall.
  • Write and save their answers in a word processing program.
  • Copy and submit via BB when ready.

Security trade-off:

  • The instructor could not freeze the BB screen during the exam (students needed access to word processing and might need to message the instructor with questions).
  • An honor pledge was posted at the beginning of the midterm as the only safeguard.
  • Example: A student working at home could easily access notes, the internet, or collaborate with others—no proctoring was possible.

⚖️ Academic integrity violations and their aftermath

🔍 Suspected violations

  • Numerous suspected violations of academic integrity occurred.
  • The excerpt does not specify how many total cases were suspected, but describes at least two in detail.

📋 Case 1: Admitted guilt

  • One student admitted guilt to her College Dean.
  • She was suspended for a quarter.
  • The suspension indicates she had prior violations (repeat offender).

📋 Case 2: Contested case with lengthy adjudication

The student's defense:

  • The student maintained his innocence and mounted a case to fight the charge.
  • He provided numerous documents and testimonies from his roommate and other students.
  • He made in-person pleas to the instructor during office hours to withdraw the case.
  • He cited that as an international student, he would not be able to return home for the summer if sanctioned.

The process:

  • A panel consisting of students and faculty adjudicated the case.
  • Once a possible violation case is submitted, it cannot be withdrawn (even if the instructor wanted to).
  • Based on appeals, the process took over a year to adjudicate.
  • The student was ultimately sanctioned with a quarter suspension, indicating a prior violation had occurred.

Don't confuse: The instructor had no power to drop the case once it was submitted—the process was out of their hands.

🗣️ Student feedback on cheating

  • The instructor received student feedback that the written midterm was "too easy to cheat."
  • This feedback came from students themselves, not just the instructor's observation.
  • It suggests widespread awareness that the exam format was vulnerable to dishonesty.

🔄 The rationale for switching to oral assessment

🎯 Direct cause

"The move toward an oral assessment was in part a reflection of this experience."

  • The switch was motivated by the combination of:
    • Technical constraints that forced off-site, unsupervised testing.
    • Numerous academic integrity violations.
    • Student feedback that cheating was too easy.
  • Example: If an oral exam requires real-time, face-to-face interaction, it becomes much harder to cheat or have someone else take the exam.

📚 Context from literature review (brief mention)

The excerpt includes a short literature review noting:

  • Assessment definition: gathering information from multiple and diverse sources to understand learner knowledge, skills, and dispositions.
  • Oral examinations in higher education: common in certain contexts (clinical exams, PhD defenses) but less common in undergraduate education; nearly all literature notes they are considered a novelty by college teachers and students.
  • Don't confuse: The excerpt does not claim oral exams are pedagogically superior in general—it frames them as a solution to a specific integrity and logistics problem.

4. Assessment in Higher Education

🧭 Overview

🧠 One-sentence thesis

Oral examinations offer significant advantages over written tests—including authenticity, plagiarism resistance, and inclusivity—but remain rare in undergraduate education due to concerns about objectivity and time demands.

📌 Key points (3–5)

  • What assessment is: gathering information from multiple sources to understand learner knowledge, skills, and dispositions; should align with course learning outcomes.
  • Five advantages of oral exams: develop oral communication skills, provide authentic interaction, increase inclusivity (especially for students with disabilities), enable deep understanding through discourse, and resist plagiarism.
  • Why oral exams remain scarce: concerns about reliability/objectivity (written exams perceived as more "objective") and time-consuming nature (students tested sequentially rather than simultaneously).
  • Common confusion: oral exams are sometimes perceived as only for "special populations," yet research shows students with disabilities prefer them by a vast margin, and they benefit all learners.
  • Practical solution: properly trained teaching assistants (TAs) can address the time constraint problem.

📚 Defining assessment in higher education

📚 What assessment means

Assessment is the process of gathering information from multiple and diverse sources to understand learner knowledge, skills and dispositions.

  • Assessment is not just testing; it is a broader information-gathering process.
  • Sources should be multiple and diverse to capture different aspects of learning.
  • The excerpt emphasizes understanding three dimensions: knowledge, skills, and dispositions.

🎯 Alignment with learning outcomes

  • Assessments can take several different formats.
  • Through backwards course design, assessments are intended to align with course learning outcomes.
  • The literature documents creating effective assessments as a challenging process.
  • Texts exist to support educators in building assessments.

🗣️ Oral examinations as an alternative format

🗣️ What oral exams are

  • One means of assessing course learning outcomes: conducting oral examinations rather than traditional written examinations.
  • Common in certain contexts: clinical exams, PhD defenses.
  • Less common in undergraduate education; nearly all literature observes they are considered a novelty by college teachers and students alike (despite their antiquity).

✅ Five major advantages over written exams

| Advantage | What it means | Why it matters |
| --- | --- | --- |
| Oral communication skills | Development of students' speaking abilities | Especially important for graduation and post-graduation employment |
| Authenticity | In-person interaction mirrors real-world scenarios | Written tests unlikely to occur again post-college; oral interaction more relevant to employment |
| Inclusivity | Better for diverse learners | 229 British students with disabilities preferred oral over written examinations by a vast margin |
| Deep understanding | Enables discourse and critique | Possibility of back-and-forth conversation not possible on written examinations |
| Plagiarism resistance | Harder to cheat | Direct interaction makes academic integrity violations more difficult |

🎓 Authenticity advantage

  • Sitting for a written test is unlikely to occur again in students' lives post-college.
  • In-person interaction is more aligned with real-world employment scenarios.
  • Example: A graduate will likely need to explain ideas verbally in meetings or interviews, not take written exams.

♿ Inclusivity advantage

  • One study (Waterfield & West, 2005) discussed views of 229 British students with disabilities.
  • These students preferred oral over written examinations by a vast margin.
  • Don't confuse: Some examiners avoid oral exams because they perceive them as being for "special populations," but the research shows they benefit all students, not just those with disabilities.

🧠 Deep understanding advantage

  • Oral exams allow focus on "deep understanding and critique."
  • The possibility of discourse enables examiners to probe student thinking.
  • Written examinations cannot provide this interactive exploration.

🚧 Barriers to adoption

⚖️ Reliability and objectivity concerns

  • The most prevalent concern regards the issue of reliability and objectivity.
  • Written examinations are more likely to be perceived as "objective," at least from the perspective of examiners.
  • This perception is related to concerns about possible bias in grading.
  • Don't confuse: "Objective" here means perceived fairness/consistency, not that written exams are inherently better at measuring learning.

⏰ Time-consuming nature

  • An additional concern is the time-consuming nature of this type of assessment.
  • Key difference from written exams: Rather than all students taking the examination simultaneously during a predetermined period, each student is generally tested alone and sequentially.
  • In many situations, the time constraint might make oral examinations prohibitive.
  • Example: A class of 30 students taking a 2-hour written exam simultaneously requires 2 hours total; 30 students taking 20-minute oral exams sequentially requires 10 hours.
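To make the scaling concrete, a minimal sketch of the arithmetic (class sizes and durations are hypothetical) also shows how parallel examiners shrink the total:

```python
import math

def oral_exam_hours(n_students: int, minutes_per_exam: float, n_examiners: int = 1) -> float:
    """Wall-clock hours to test everyone when each examiner tests
    students sequentially but examiners work in parallel."""
    exams_per_examiner = math.ceil(n_students / n_examiners)
    return exams_per_examiner * minutes_per_exam / 60

print(oral_exam_hours(30, 20))                 # 10.0 hours with one examiner
print(oral_exam_hours(30, 20, n_examiners=5))  # 2.0 hours with five TAs in parallel
```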

🛠️ Practical solution to time constraints

  • Several examiners mentioned the presence of properly trained TAs as the best solution to this problem.
  • Multiple examiners can conduct oral exams in parallel, reducing total time.
  • TAs must be properly trained to ensure consistency and fairness.

🎭 Concerns about student characteristics

  • Some examiners were concerned that only the most extraverted, confident students would do well on an oral exam.
  • This concern relates to potential bias based on personality traits rather than knowledge.
  • The excerpt does not provide resolution to this concern but notes it as a barrier to adoption.

5. Oral Examinations

🧭 Overview

🧠 One-sentence thesis

Oral examinations offer significant pedagogical and integrity advantages over written tests but remain rare in undergraduate education due to concerns about time, objectivity, and potential bias against certain student populations.

📌 Key points (3–5)

  • Why oral exams are rare: Despite clear benefits, they are considered a novelty in undergraduate settings and face concerns about reliability, objectivity, and time requirements.
  • Five major advantages: development of oral communication skills, authenticity, inclusivity for students with disabilities, ability to assess deep understanding through discourse, and resistance to plagiarism.
  • Time constraint challenge: oral exams require sequential individual testing rather than simultaneous administration, making them time-consuming; trained TAs can help address this.
  • Common confusion about equity: some assume oral exams favor extraverted students or disadvantage non-native English speakers, but research shows students with disabilities prefer them; the real challenge is "institutional discourse" that can disadvantage students from diverse backgrounds.
  • Context of adoption: the move toward oral assessment was partly driven by academic integrity violations in online written exams where cheating was "too easy."

🎯 Why oral examinations emerged

📉 Academic integrity problems with written exams

  • The excerpt describes an online midterm examination that resulted in numerous suspected academic integrity violations.
  • One student admitted guilt and was suspended; another fought the charge for over a year before being sanctioned.
  • Student feedback indicated the written midterm was "too easy to cheat."
  • An honor pledge posted at the beginning was insufficient to prevent violations.
  • Key constraint: the online format and inability to proctor in a lecture hall increased cheating possibilities.

🔄 The shift to oral assessment

  • The move toward oral assessment was "in part a reflection of this experience" with cheating.
  • Oral examinations offer resistance to plagiarism (one of the five major advantages listed).
  • Example: Unlike a written exam where answers can be copied or shared, an oral exam requires the student to demonstrate knowledge in real-time interaction.

✅ Five major advantages of oral examinations

🗣️ Development of oral communication skills

  • Oral exams help students develop communication abilities, especially relevant for graduation and post-graduation employment.
  • This skill is directly transferable to professional contexts.

🎭 Authenticity of assessment

  • In-person interaction is more authentic than sitting for a written test.
  • Written tests are "unlikely to occur again in their lives post-college," whereas oral communication is common in employment settings.
  • The assessment format mirrors real-world professional interactions.

♿ Inclusivity for students with disabilities

  • A study of 229 British students with disabilities found they preferred oral over written examinations "by a vast margin."
  • This challenges the assumption that written exams are the most equitable format.
  • Don't confuse: inclusivity for students with disabilities vs. concerns about other populations (addressed separately below).

🧠 Assessment of deep understanding

"Deep understanding and critique" can be assessed via the possibility of discourse, which is not possible on written examinations.

  • The interactive nature allows examiners to probe student thinking through follow-up questions.
  • Written exams are limited to pre-set questions without the ability to explore reasoning.
  • Example: An examiner can ask "why" or "how" follow-ups to assess whether a student truly understands a concept or has merely memorized it.

🛡️ Resistance to plagiarism

  • Oral examinations inherently prevent copying or using unauthorized materials.
  • This advantage became particularly relevant given the "too easy to cheat" feedback on written exams.

⚠️ Barriers to adoption

⏱️ Time-consuming nature

  • The core problem: rather than all students taking the examination simultaneously during a predetermined period, each student is generally tested alone and sequentially.
  • This makes oral examinations "prohibitive" in many situations.
  • Solution mentioned: the presence of properly trained TAs is cited as "the best solution to this problem."
  • Example: If 100 students need 15-minute exams, that requires 25 hours of examination time, versus a single 2-hour written exam session.
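The same arithmetic can be inverted to estimate staffing; a small sketch (numbers hypothetical) solves for the parallel examiners needed to fit a fixed window:

```python
import math

def examiners_needed(n_students: int, minutes_per_exam: float, window_hours: float) -> int:
    """Smallest number of parallel examiners that fits all sequential
    oral exams into the available time window."""
    total_minutes = n_students * minutes_per_exam
    return math.ceil(total_minutes / (window_hours * 60))

# Fitting 100 fifteen-minute exams into a 5-hour window takes 5 examiners.
print(examiners_needed(100, 15, 5))  # 5
```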

📏 Reliability and objectivity concerns

  • Written examinations are more likely to be perceived as "objective," at least from the perspective of examiners.
  • Concerns exist about possible bias in grading oral exams.
  • Why this matters: perceived lack of objectivity may discourage instructors from adopting oral exams even when they would be pedagogically superior.
  • Don't confuse: actual bias vs. perception of bias—the excerpt notes this is about how examiners perceive written exams as more objective, not necessarily that they are.

🌍 Oral exams and diverse student populations

♿ Students with disabilities prefer oral exams

  • Students with disabilities prefer oral examinations over written examinations.
  • Some examiners mistakenly avoid oral examinations because they are perceived as being for "special populations."
  • This perception is backwards: oral exams can be more inclusive, not less.

🗣️ Concerns about language and cultural background

  • Some examiners worry that:
    • Only the most extraverted, confident students would do well
    • Students whose first language is not English would be at a disadvantage
    • Students educated in another country's system would struggle
  • These concerns are "well documented" and involve "significant discrimination" in some contexts.

📚 The challenge of "institutional discourse"

"Institutional discourse": the everyday competencies and practices have to be presented in institutional terms through language that reifies and abstracts these practices.

  • This type of discourse is distinct from:
    • Personal experience discourse
    • Professional discourse
  • Oral examinations involve a "hybrid discourse"—a combination of all three types.
  • Why this matters: this hybridity can pose major problems of evaluation for both examiners and examinees.
  • Who is affected: undergraduate students of various national, ethnic, class, and racial backgrounds can struggle with institutional discourse.
  • Example: A student may understand a concept from personal experience but struggle to express it using the abstract, formal language expected in academic settings.

📖 Context and literature gaps

📚 Rarity despite antiquity

  • Oral examinations are "de rigueur" (standard) in certain contexts:
    • Clinical exams
    • PhD defenses
  • They are "less common in undergraduate education."
  • Nearly all the literature observes that oral examinations are considered a "novelty" by college teachers and students alike, despite their historical use.

🔬 Research gaps this study addresses

  • Most studies examine oral examinations in graduate settings.
  • This study aims to contribute toward integration of oral examinations in:
    • Undergraduate settings
    • Diverse contexts
    • Large undergraduate diversity courses
  • The study is observational (not experimental), with anonymous cross-sectional data.
  • The oral examination was first piloted during Fall 2017 for a university diversity requirement course.

6. Oral Examinations and Diverse Populations

🧭 Overview

🧠 One-sentence thesis

Oral examinations offer significant advantages over written exams—including inclusivity for students with disabilities and assessment of deep understanding—but remain rare due to concerns about objectivity, time constraints, and potential disadvantages for non-native speakers and students from diverse backgrounds.

📌 Key points (3–5)

  • Five major advantages: oral exams develop communication skills, provide authentic interaction, support inclusivity (especially for students with disabilities), enable assessment of deep understanding through discourse, and resist plagiarism.
  • Why they remain scarce: concerns about reliability/objectivity compared to "objective" written exams, and the time-consuming sequential nature of testing students individually.
  • Diversity considerations: students with disabilities prefer oral exams, but non-native English speakers and students educated in other countries may face disadvantages due to "institutional discourse" requirements.
  • Common confusion: some examiners avoid oral exams thinking they are only for "special populations," while others worry only extraverted students will succeed—both views overlook the broader benefits and challenges.
  • Implementation gap: oral exams are more common in graduate settings; the literature lacks sufficient research on undergraduate contexts, especially in diverse classroom settings.

🎯 Advantages of oral examinations

💼 Communication skills and authenticity

  • Oral communication development: especially valuable for graduation and post-graduation employment contexts.
  • Authentic interaction: in-person discourse mirrors real-world professional situations, unlike written tests which students are unlikely to encounter again after college.
  • Example: An employer interview or professional presentation resembles an oral exam more than a written test.

🧠 Deep learning and academic integrity

  • Deep understanding and critique: oral exams allow discourse and follow-up questions, enabling assessment of comprehension that written exams cannot capture.
  • Plagiarism resistance: the interactive nature makes cheating significantly harder.
  • Don't confuse: this is not just about "knowing facts" but about demonstrating understanding through real-time explanation and response.

♿ Inclusivity for students with disabilities

One study (Waterfield & West, 2005) found that 229 British students with disabilities preferred oral over written examinations by a vast margin.

  • This represents a major inclusivity advantage that written exams do not provide.
  • The preference was described as "vast," indicating strong student support for this format.

⚠️ Barriers to adoption

📏 Reliability and objectivity concerns

  • Perceived objectivity gap: written examinations are more likely to be perceived as "objective," at least from examiners' perspective.
  • Bias concerns: related worries about possible bias in grading oral exams.
  • This perception issue is the "most prevalent concern" mentioned in the literature, even though oral exams offer other advantages.

⏰ Time constraints

  • Sequential testing problem: rather than all students taking the exam simultaneously during a predetermined period, each student is generally tested alone and sequentially.
  • This makes oral examinations potentially prohibitive in many situations.
  • Proposed solution: the presence of properly trained teaching assistants (TAs) was mentioned by several examiners as the best solution to this time problem.

🌍 Diversity and equity considerations

🎭 Conflicting perceptions about who benefits

| Perception | Concern | Reality check |
| --- | --- | --- |
| "Special populations only" | Some examiners avoid oral exams because they see them as only for special populations | Students with disabilities do prefer them, but benefits extend to all students |
| "Only extraverts succeed" | Worry that only extraverted, confident students will do well | Overlooks that written exams may disadvantage other student types |
| "Language barrier" | Students whose first language is not English or who were educated in another country's system would be at a disadvantage | This concern is well-documented and requires attention |

🗣️ Institutional discourse challenges

"Institutional discourse" refers to how "the everyday competencies and practices... have to be presented in institutional terms through language that reifies and abstracts these practices."

  • Three types of discourse:
    1. Personal experience discourse
    2. Professional discourse
    3. Institutional discourse
  • Hybrid discourse problem: oral examinations involve a combination of all three types, creating a "hybrid discourse."
  • This hybridity can pose major problems of evaluation for both examiners and examinees.

🌐 Impact on diverse student populations

  • The institutional discourse challenge was documented in a study of medical doctors (Roberts et al., 2000) but applies broadly.
  • Who may struggle: undergraduate students of various national, ethnic, class, and racial backgrounds.
  • Significant discrimination concern: Roberts et al. (2000) addressed concerns of significant discrimination in oral examinations for the Royal College of General Practitioners.
  • Don't confuse: the problem is not oral exams themselves, but the specific linguistic and cultural demands of "institutional discourse" that may disadvantage certain groups.

🔬 Research context and gaps

📚 Literature gaps

  • Graduate vs undergraduate focus: oral examinations have been predominantly examined in graduate settings, with fewer studies in undergraduate contexts.
  • Need for diverse contexts: the literature lacks sufficient research on implementing oral examinations in diverse undergraduate classroom settings.

🎓 Study aims

  • The current study (described in the excerpt) aims to build upon existing research and fill gaps regarding implementing oral examinations in undergraduate settings.
  • Specific context: large undergraduate diversity courses with diverse student populations.
  • Pilot implementation: the oral examination was first piloted during Fall 2017 for a Diversity and Race course (a university diversity requirement), administered by TAs during regular discussion section times.

7. Oral Examination Pilot

🧭 Overview

🧠 One-sentence thesis

The oral examination pilot in undergraduate diversity courses aimed to address implementation gaps in the literature while navigating challenges of institutional discourse, anxiety, and logistical constraints across two iterations.

📌 Key points (3–5)

  • Institutional discourse barrier: Oral exams require a "hybrid discourse" combining institutional, professional, and personal language, which disadvantages students from diverse backgrounds and non-English-first-language speakers.
  • Pilot context: Implemented in large undergraduate diversity courses (Fall 2017: 279 students; Winter 2018: 167 students) with ethnically diverse populations, filling a research gap since most oral exam studies focus on graduate settings.
  • Procedure evolution: Students received 10 questions one week early, selected one randomly during the exam, and had 60 seconds to answer; logistical adjustments (single TA, extended time, Skype options) emerged from first-day challenges.
  • Common confusion: Institutional discourse vs. professional/personal discourse—oral exams demand all three simultaneously, creating evaluation difficulties for both examiners and examinees.
  • Key concern: Inter-rater reliability and documentation were initial worries, but grading proved straightforward using answer keys and behavioral cues (body language, clarity).

🚧 The Institutional Discourse Challenge

🗣️ What hybrid discourse means

"Institutional discourse" requires presenting "everyday competencies and practices... in institutional terms through language that reifies and abstracts these practices."

  • Oral examinations involve a hybrid discourse: a combination of institutional, professional, and personal experience discourse.
  • This is distinct from simply talking about personal experience or using professional jargon alone.
  • The excerpt emphasizes this hybridity "can pose major problems of evaluation for both examiners and examinees."

⚠️ Who is disadvantaged

  • Students whose first language is not English or who were educated in another country's system face significant disadvantage.
  • The excerpt cites documented discrimination concerns in oral examinations for the Royal College of General Practitioners.
  • Undergraduate students of "various national, ethnic, class, and racial backgrounds" also encounter institutional discourse as a stumbling block.
  • Don't confuse: The barrier is not just language proficiency—it's the specific ability to translate everyday competencies into abstract institutional language.

🎓 Study Design and Context

🎯 Research gap addressed

  • Most oral examination research focuses on graduate settings.
  • This study aims to fill gaps regarding "implementing oral examinations in undergraduate settings."
  • The classroom setting: large undergraduate diversity courses with high ethnic diversity.

👥 Participant demographics

Fall 2017 (279 students):

  • Ages 18–27 with mixed cultural identities
  • Year distribution: 68 First Year, 84 Second Year, 58 Third Year, 69 Fourth Year
  • Campus ethnicity: 37.6% Asian, 19.5% International, 19.1% White, 17.8% Chicano/Latino, 2.5% African American, 0.4% American Indian, 0.2% Pacific Islander, 2.8% Undeclared

Winter 2018 (167 students):

  • Zero first year, 10 second year, 81 third year, 76 fourth year
  • High dropout rate (~20%) during drop-add period; remaining students "may be biased toward self-selected high achievers"

📋 Study nature

  • Observational, not experimental
  • Students not followed longitudinally; all data anonymous and cross-sectional
  • IRB-certified exempt under category 1 (normal educational practices in established settings)

🔧 Fall 2017 Implementation

📝 Pre-exam preparation

Students received:

  • Written instructions on learning outcomes, format, study activities (e.g., "quizzing other students rather than repeatedly rereading lecture notes"), and content scope
  • One week before: list of 10 short-response questions with percentage grading breakdowns—the same questions used during the actual exam
  • In-class practice of both content and structure before examinations
  • Online sign-up for time slots during section meetings

🤝 TA coordination

Initial concerns and solutions:

| Concern | Solution |
| --- | --- |
| Inter-rater reliability | Two TAs present during each exam |
| No documentation to corroborate grading | Two groups of TAs grading per section |
| Tracking progress | Shared Google Document with schedule and grades updated after each exam |

  • TAs could coordinate which students were being tested and note major grading discrepancies between groups.

⏱️ Exam day procedure

  • Students entered one at a time (format met disability requirements for individual exam rooms)
  • Selected a slip of paper with one question from an envelope
  • 60 seconds to answer + 30 seconds for TAs to prompt for missed portions
  • Each slip contained grade breakdown and unique tracking number
  • TAs had answer keys with correct responses
  • Grading was immediate; grades were logged in the shared Google Document and written on the slip for student recordkeeping
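A toy simulation (timings invented) illustrates why even modest per-student overruns cascade through a fixed-slot sign-up schedule, which is the Day 1 problem described under mid-pilot adjustments below:

```python
def start_delays(n_students: int, slot_seconds: float, actual_seconds: float) -> list[float]:
    """Students are scheduled at fixed slot boundaries, but a slow exam
    pushes every later start back; return each student's delay in seconds."""
    delays, clock = [], 0.0
    for i in range(n_students):
        scheduled = i * slot_seconds
        start = max(clock, scheduled)
        delays.append(start - scheduled)
        clock = start + actual_seconds
    return delays

# With 90-second slots but nervous students averaging 120 seconds,
# the 20th student starts about 9.5 minutes late.
print(start_delays(20, 90, 120)[-1] / 60)  # 9.5
```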

🔄 Mid-pilot adjustments

Problem identified on Day 1:

  • Testing took longer than 90 seconds, causing schedule delays
  • Some students had classes directly after and couldn't stay
  • Primary cause: student nervousness (needed relaxation time, took longer than 60 seconds due to anxiety)

Solutions implemented:

  • Day 1: TAs split into three testing rooms to speed up process
  • Day 2: TAs proctored individually (no issues with schedule)
  • Rationale: Grading was "remarkably straightforward" because TAs had answers and could rely on cues like body language and response clarity

🔄 Winter 2018 Iteration

📝 Pre-exam enhancements

Similar to Fall 2017, plus:

  • Additional practice run during class: students simulated picking questions from envelope and answering
  • When students struggled, instructor involved the entire class to participate
  • Instructor clarified any concept concerns from the question set
  • Students reminded that 20% of assessment was based on confidence levels in responding

🌐 Logistical adaptations

No discussion sections for this class:

  • Exams scheduled on two different weekdays outside regular lecture hours
  • TAs created online sign-up via Google Sheets
  • Approximately 11 students who couldn't attend conducted exams over Skype
  • Second sign-up sheet circulated for students with class/work conflicts; TAs offered additional Skype times

📊 Outcome context

  • 144 student evaluations of teaching (SETs) submitted
  • High dropout rate (~20%); average grade during drop-add was D before normalization
  • Remaining students "may be biased toward self-selected high achievers"

🎯 Key Procedural Insights

✅ What worked

  • Advance question distribution: Giving students all 10 questions one week early reduced uncertainty
  • In-class practice: Multiple practice runs helped ease anxiety and clarify concepts
  • Flexible scheduling: Skype option accommodated students with conflicts
  • Single-TA proctoring: Proved sufficient after initial two-TA setup; grading was straightforward with answer keys and behavioral cues

🧠 Grading reliability

  • Initial worry: inter-rater reliability and lack of documentation
  • Reality: "Grading was remarkably straightforward"
  • TAs relied on:
    • Answer keys in front of them
    • Body language cues
    • Clarity of responses
  • These factors allowed TAs to "determine whether an individual knew the complete answer"

⏰ Time management lessons

  • Original 90-second estimate (60s answer + 30s prompt) was insufficient
  • Student anxiety was the primary delay factor, not content difficulty
  • Splitting into more rooms and individual proctoring resolved scheduling issues
  • Don't confuse: The time problem was not about question difficulty but about managing student nervousness in a high-stakes oral format

8. Oral Examination, Version Two

🧭 Overview

🧠 One-sentence thesis

The second version of the oral examination refined the procedure by adding a practice run, allowing two questions per student, and offering Skype options for remote testing, while maintaining immediate grading and individual proctoring.

📌 Key points (3–5)

  • When and who: Winter 2018 quarter with 167 undergraduate students; approximately 11 students took the exam over Skype due to scheduling conflicts.
  • Key changes from version one: students answered two questions instead of one, had an in-class practice simulation, and could "pass" one question for partial credit.
  • Proctoring evolution: individual TA proctoring (not pairs) became standard after the first version showed it was efficient and straightforward.
  • Common confusion: the 20% confidence-based grading component was clarified during the practice run to help ease student anxiety.
  • Participant bias: approximately 20% dropout rate during the drop-add period means remaining students may be self-selected high achievers.

📅 Timeline and participants

📅 When it happened

  • Winter 2018 quarter.
  • Exams scheduled on two different days outside regular lecture hours.
  • Students received the ten possible questions one week before the test.

👥 Who participated

  • 167 undergraduate students total.
  • Class composition: zero first-year; 10 second-year; 81 third-year; 76 fourth-year.
  • Approximately 11 students conducted the exam over Skype.
  • 144 students submitted student evaluations of teaching (SETs).

⚠️ Selection bias note

  • The course had a very high dropout rate of approximately 20%.
  • Average grade during the drop-add period was a D before normalization.
  • Students who remained may be biased toward self-selected high achievers.

🎯 Pre-examination organization

📝 Question distribution and practice

  • Students received the ten possible exam questions one week before the test (same as version one).
  • New addition: an in-class practice run was conducted to help ease anxiety.
    • Students simulated the oral examination by picking a question from an envelope and attempting to answer it.
    • When some students had difficulties, the instructor asked the remainder of the class to participate.
    • The instructor clarified any concerns about the concepts referenced in the questions.

💯 Confidence grading reminder

  • Students were reminded that 20% of the assessment was based upon their levels of confidence in responding to the questions.
  • This reminder was given during the practice run to address anxiety.

📆 Scheduling process

  • No discussion sections existed for this class, so exams were scheduled outside regular lecture hours.
  • TAs created an online sign-up sheet via Google Sheets.
  • A second sign-up sheet was circulated through Google Documents for students who could not attend due to classes and work obligations.
  • TAs offered those students additional times via Skype.

🏛️ Proctoring procedure

🎲 Question selection format

  • Students selected two slips from the envelope (contrast: version one used only one question).
  • Each slip contained a grade breakdown and a unique number for tracking.
  • TAs had a list of all questions with their correct responses.

⏱️ Timing

  • Students were given approximately two minutes to answer each question.
  • Don't confuse: version one allowed 60 seconds to answer plus 30 seconds for TA prompts; version two gave approximately two minutes per question without the separate prompt period mentioned.

🔄 Pass option

  • New feature: if unable to answer one of the randomly chosen questions, students had the option of "passing" the question by answering an alternative question.
  • Students would lose some points for using this option.
  • Students were told about this option in the in-class midterm review.
  • Example: a student pulls two questions from the envelope, struggles with one, and chooses to answer the alternative question instead, accepting a point deduction.
  • Usage: despite a few students initially struggling, only one student utilized the option of answering the alternative question.
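The excerpt does not state the size of the deduction; a minimal scoring sketch with an invented 20% penalty illustrates the trade-off a student faced:

```python
# Hypothetical sketch: the actual point deduction for "passing" a question
# is not given in the excerpt, so the 20% penalty is an assumption.

def question_score(points_earned: float, points_possible: float, passed: bool) -> float:
    """Score for one question; answering an alternative question after a
    pass keeps earned credit minus a fixed penalty."""
    PASS_PENALTY = 0.20  # assumed fraction of the question's value
    score = points_earned - (PASS_PENALTY * points_possible if passed else 0.0)
    return max(0.0, score)

print(question_score(10, 10, passed=False))  # 10.0: answered the drawn question
print(question_score(10, 10, passed=True))   # 8.0: full answer on the alternative, minus penalty
```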

✅ Grading process

  • Grading was immediate (same as version one).
  • After logging grades in the shared Google Document, TAs wrote them on the slip of paper that each student took from the envelope.
  • The slip was given back to the student for recordkeeping.

👤 Individual proctoring

  • Examinations were proctored individually by one TA (evolved from version one).
  • The excerpt notes this was "as evolved in the first quarter," referring to the decision made during version one to split into three testing rooms with individual TAs.

📊 Data collection methods

📋 Midterm feedback survey

  • Administered in-class using an internet poll after completing the oral examination.
  • Multiple choice evaluation format.
  • Questions evaluated: midterm goal, format, preparation, and student anxiety taking the exam.

| Quarter | Response rate |
| --- | --- |
| Fall 2017 | Approximately 80% |
| Winter 2018 | Approximately 70% |

📝 Student evaluation of teaching (SET)

  • Administered at the end of the course (university standard).
  • Included multiple choice and free response questions.
  • Two multiple choice questions were relevant to class assignments.
  • One free response question asked students to give their feedback (question text cut off in excerpt).

🔍 Analysis approach

  • Researchers applied multiple coding strategies to the answers and triangulated the data.
  • Open codes were established utilizing a grounded theory approach derived directly from the language used by student and TA participants.
  • Open codes were reviewed and then integrated into thematic categories.
  • Don't confuse: the excerpt describes the methodology but does not present the actual results or thematic categories.

9. Data Collection

🧭 Overview

🧠 One-sentence thesis

The researchers collected data through multiple surveys—midterm feedback, end-of-course student evaluations, and TA surveys—and applied grounded-theory coding to understand how students and TAs experienced the shift from written to oral examinations.

📌 Key points (3–5)

  • Multiple data sources: midterm student feedback survey, university student evaluation of teaching (SET), and TA experience survey were all used.
  • Coding strategy: open codes derived from participant language were grouped into thematic categories using grounded theory.
  • High response rates: approximately 70–85% for student surveys and 71% for TA surveys, ensuring accuracy.
  • Common confusion: the SET "Exams, Quizzes, and Papers" section included both general assignment comments and midterm-specific comments; researchers had to differentiate them carefully.
  • Overall student response: Fall 2017 data showed mostly positive feedback, with only eight instances of negative comments about the oral midterm.

📋 Data collection instruments

📋 Midterm student feedback survey

  • Administered after completing the oral examination during class.
  • Used an internet poll with multiple-choice questions.
  • Evaluated: midterm goal, format, preparation, and student anxiety.
  • Response rates:
    • Fall 2017: approximately 80%
    • Winter 2018: approximately 70%

📋 Student evaluation of teaching (SET)

  • University-wide evaluation administered at the end of the course.
  • Included both multiple-choice and free-response questions.
  • Relevant components:
    • Two multiple-choice questions about class assignments.
    • One free-response question asking opinions on assignments.
  • Instructor prompt: students were asked to comment specifically on the oral examination, whether it improved their ability to explain concepts, and whether they preferred it to a written midterm.
  • Response rates:
    • Fall 2017: approximately 75%
    • Winter 2018: approximately 85%

📋 TA midterm survey

  • Administered at the end of each course.
  • Used open-ended free-response questions to evaluate TAs' experiences administering the oral exam.
  • Response rate: 71%

🔍 Data analysis approach

🔍 Grounded theory coding

Open codes were established utilizing a grounded theory approach derived directly from the language used by student and TA participants.

  • Researchers did not impose pre-existing categories; instead, they let codes emerge from the actual words participants used.
  • Process:
    1. Open codes were reviewed.
    2. Codes were integrated into thematic categories.
    3. Categories are presented in the subsequent Data Analysis and Results section.
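As a toy illustration of this pipeline (the codes, themes, and comments below are invented, not the study's actual categories), open coding can be pictured as tagging responses with participant-language codes and rolling those codes up into themes:

```python
from collections import Counter

# Invented codes and themes; the study's real categories appear in its
# Data Analysis and Results section.
code_to_theme = {
    "easy": "Ease",
    "nerve-racking": "Anxiety",
    "helped me explain": "Preference",
}

comments = [
    "the oral exam was easy once I did the readings",
    "a small interview can be nerve-racking",
    "it helped me explain concepts to others",
]

theme_counts = Counter(
    theme
    for comment in comments
    for code, theme in code_to_theme.items()
    if code in comment
)
print(theme_counts)  # Counter({'Ease': 1, 'Anxiety': 1, 'Preference': 1})
```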

🔍 Triangulation and legitimacy

  • Multiple coding strategies were applied to answers.
  • Data were triangulated across different sources (midterm feedback, SET, TA surveys).
  • High response rates across all surveys enabled accuracy of results.

📊 Fall 2017 results: student experience

📊 SET response breakdown

  • Total SET respondents: 212 out of 279 students.
  • "Exams, Quizzes, and Papers" section: 64 responses total.
    • 53 responses evaluated assignments in general or specifically evaluated the oral examination.
    • Responses eliminated if they only listed assignments without evaluation.
  • Midterm-specific responses: 36 comments used vocabulary like "oral exam," "the midterm," or "the exam."
    • Responses using "exams" (plural) were included only if they differentiated between exams and quizzes, since the oral midterm was the only exam given.
  • Additional comments: 3 comments in the general class review and 1 in the instructor review also mentioned the oral midterm.
  • Note: responses could fall into more than one category.
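These inclusion rules translate naturally into a keyword filter; a minimal sketch follows (the function and its exact matching logic are illustrative, not the researchers' actual procedure):

```python
MIDTERM_TERMS = ("oral exam", "the midterm", "the exam")

def mentions_midterm(comment: str) -> bool:
    """Apply the inclusion rules above: singular exam vocabulary counts;
    plural 'exams' counts only if quizzes are mentioned separately."""
    text = comment.lower()
    if any(term in text for term in MIDTERM_TERMS):
        return True
    return "exams" in text and "quiz" in text

print(mentions_midterm("The oral exam was fair"))             # True
print(mentions_midterm("Exams and quizzes were reflective"))  # True
print(mentions_midterm("All exams were easy"))                # False
```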

📊 General assignment evaluations

Comments about assignments in general fell into two categories:

| Evaluation Category | Number of Evaluations | Example |
| --- | --- | --- |
| Easy | 12 | "All the assignments were really easy" |
| Fair and representative | 6 | "I thought the exams were a good way to test our knowledge" |
| Ambiguous | 1 | "Prepares you well" (subject unclear) |

📊 Midterm-specific evaluations

More diversity in responses; six main categories emerged:

| Evaluation Category | Number of Evaluations | Example / Notes |
| --- | --- | --- |
| Preference (total) | 19 | General liking or specific reasons |
| – Helped explain concepts | 7 | Subcategory of preference |
| – Helped understand concepts | 5 | Subcategory of preference |
| – Helped retain material | 3 | Subcategory of preference |
| – General preference | 4 | Subcategory of preference |
| Ease | 17 | "Midterm and final were fairly easy - do the readings and go to lecture, you'll do fine on both" |
| Anxiety | 4 | "Midterm is a small interview, which can be intimidating or nerve-racking for some students" |
| Representativeness | (not specified) | "All of the exams/quizzes were reflective of the material we learned!" |
| Exam Procedures | (not specified) | "Oral midterm was not run well. Many students were late to other classes or waited more than 30 minutes" |
| Difficulty | (not specified) | "Oral midterm exam took more time for me to study compared to the in-class midterm" |

📊 Overall sentiment

  • Positive response: overall positive student response to the oral midterms.
  • Negative feedback: only eight instances documented.
    • Three of these were nested within comments that also gave positive feedback.
  • Don't confuse: some students found the exam easy (17 responses) while others noted it required more study time or caused anxiety; these are not contradictory—they reflect different aspects of the experience.

10. Student Experience: Fall 2017 Diversity and Race Course

🧭 Overview

🧠 One-sentence thesis

Students overwhelmingly preferred the oral midterm examination format over written exams, reporting that it helped them explain concepts despite causing some anxiety, with 97% agreeing it met the goal of preparing them to discuss race and racism.

📌 Key points (3–5)

  • High preference for oral format: 91% of students would not have preferred a written exam, and the oral format was seen as easier or equal in preparation time by 80%.
  • Pedagogical effectiveness: 97% felt the oral exam helped them better explain concepts about race and racism to others; students noted it improved their ability to explain, understand, and retain material.
  • Anxiety vs. preference paradox: Over half (53%) found the oral format more nerve-wracking than written exams, yet still preferred it—anxiety did not override the perceived benefits.
  • Common confusion: Students may conflate "more anxiety" with "worse format," but the data shows preference and effectiveness can coexist with higher stress.
  • Overall positive reception: Only 8 instances of negative feedback appeared in the midterm-specific evaluations, with 3 of those nested within otherwise positive comments.

📊 Survey methodology and response rates

📋 Data collection instruments

The study used three types of surveys:

  • Student Evaluation of Teaching (SET): Standard university course evaluation administered at the end of the term, including a section on "Exams, Quizzes, and Papers."
  • In-class oral examination evaluation: Instructor-administered survey during the course, asking students to evaluate whether the midterm met goals, their format preference, preparation time, and anxiety levels.
  • TA midterm survey: End-of-course evaluation asking teaching assistants about their experiences administering the oral exam (71% response rate).

📈 Response rates

  • Fall 2017 SET: 212 of 279 students responded (approximately 76% response rate).
  • Fall 2017 in-class evaluation: 260 students present and responded.
  • The excerpt notes that "accuracy of results was enabled by a high response rate across all the surveys."

🔍 Coding and categorization rules

  • Responses were eliminated if they only listed assignments without evaluating them (e.g., "There was an oral midterm and a final video project").
  • Comments using "exams" (plural) were included only if they differentiated between exams and quizzes, since the oral midterm was the only exam given.
  • Responses could fall into more than one category; a minimal sketch of these coding rules follows this list.
  • 64 total responses in the SET "Exams, Quizzes, and Papers" section; 53 evaluated assignments generally or the oral exam specifically.
  • 18 comments about assignments generally; 36 specifically evaluated the oral midterm (plus 4 additional comments from other SET sections).
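
Below is a minimal sketch of how these exclusion and coding rules could be applied programmatically. The category names and keyword checks are illustrative assumptions standing in for the study's manual coding, which was done by a single human coder.

```python
# Illustrative sketch of the SET comment-coding rules described above.
# Keyword predicates are hypothetical stand-ins for human judgment.

def is_listing_only(comment: str) -> bool:
    # Flags comments that merely list assignments without evaluating them,
    # e.g., "There was an oral midterm and a final video project".
    evaluative_words = ("easy", "fair", "good", "helped", "hard", "stressful")
    return not any(word in comment.lower() for word in evaluative_words)

def undifferentiated_exams(comment: str) -> bool:
    # "Exams" (plural) counts only if the comment also distinguishes exams
    # from quizzes, since the oral midterm was the only exam given.
    text = comment.lower()
    return "exams" in text and "quiz" not in text

def code_comment(comment: str) -> set[str]:
    """Return the (possibly multiple) categories a comment falls into."""
    if is_listing_only(comment) or undifferentiated_exams(comment):
        return set()  # excluded from the analysis
    text = comment.lower()
    categories = set()
    if "easy" in text:
        categories.add("ease")
    if "nerve" in text or "intimidating" in text:
        categories.add("anxiety")
    if any(w in text for w in ("explain", "understand", "retain", "prefer")):
        categories.add("preference")
    return categories

# One response can land in two bins, matching the multi-category rule:
print(code_comment("The oral exam was nerve-racking but helped me explain concepts"))
```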

💬 General assignment evaluations

✅ Two main categories

Comments about assignments in general fell into two categories:

| Category | Example | Number of evaluations |
| --- | --- | --- |
| Easy | "All the assignments were really easy" | 12 |
| Fair and representative | "I thought the exams were a good way to test our knowledge" | 6 |
| Ambiguous | "Prepares you well" (subject unclear) | 1 (excluded) |

🎯 What this reveals

  • Students found the overall assignment structure accessible and aligned with course content.
  • The "easy" designation does not necessarily mean trivial—it may indicate clear expectations and adequate preparation support.
  • Fair and representative comments suggest students perceived the assessments as valid measures of learning.

🎤 Midterm-specific evaluations

❤️ Preference for oral format

The largest category of responses (19 total) expressed preference for the oral exam over written alternatives, broken into subcategories:

| Subcategory | Number of evaluations | What students said |
| --- | --- | --- |
| Helped explain concepts | 7 | Oral format improved ability to articulate ideas |
| Helped understand concepts | 5 | Deepened comprehension of material |
| Helped retain material | 3 | Better memory of course content |
| General preference | 4 | Liked the format without specifying why |

Why this matters: The oral format was not just preferred aesthetically—students identified specific cognitive and communicative benefits.

Example: A student might find that preparing to speak about a concept forces them to organize their thoughts more clearly than writing an essay, leading to better understanding.

😰 Anxiety and difficulty

Anxiety (4 evaluations):

  • Students described the oral midterm as "intimidating or nerve-racking."
  • This was the most common negative theme.

Difficulty (2 evaluations):

  • "Oral midterm exam took more time for me to study compared to the in-class midterm."
  • Note: This contrasts with the in-class evaluation data showing 80% felt preparation time was equal or less.

Don't confuse: Anxiety about a format does not mean students reject it—the data shows students can prefer a format that makes them nervous if they perceive it as more effective.

🎯 Other evaluation dimensions

Ease (17 evaluations):

  • "Midterm and final were fairly easy - do the readings and go to lecture, you'll do fine on both."
  • Second-largest category, indicating many students found the exam manageable.

Representativeness (2 evaluations):

  • "All of the exams/quizzes were reflective of the material we learned!"
  • Confirms alignment between exam content and course material.

Exam Procedures (2 evaluations):

  • "Oral midterm was not run well. Many students were late to other classes or waited more than 30 minutes."
  • Logistical concerns about scheduling and wait times.

📉 Negative feedback summary

  • Only 8 instances of negative feedback total.
  • 3 of these were nested within comments that also gave positive feedback.
  • Example: A student might say "The oral exam was nerve-wracking but really helped me learn to explain concepts."

📊 In-class evaluation results

🎯 Meeting course goals

Goal: "Helping prepare students to better explain concepts about race and racism to other people."

  • 97% of students who responded felt the oral exam met this goal.
  • This represents near-universal agreement on pedagogical effectiveness.

📝 Format preference

  • 91% said they would NOT have preferred responding to questions in a written format.
  • This strongly corroborates the SET findings about format preference.

⏱️ Preparation time

  • 80% felt the oral format took equal or less time to prepare for than a written exam.
  • This contrasts with the minority view (from the difficulty evaluations) that oral exams require more study time.
  • Possible explanation: Different students may have different preparation strategies, or the perception of time may vary.

😟 Anxiety levels

  • 53% found it more nerve-wracking to prepare for the oral exam than a written one.
  • This is a majority, confirming that anxiety is a real factor.

Key insight: Despite 53% experiencing more anxiety, 91% still preferred the oral format—effectiveness and learning value outweighed discomfort.

🔄 Comparison with Winter 2018 course

📊 SET quantitative data comparison

"Assignments promote learning":

| Response | Fall 2017 | Winter 2018 |
| --- | --- | --- |
| Strongly Disagree | 1.9% | 1.4% |
| Disagree | 4.4% | 2.1% |
| Neither Agree nor Disagree | 11.2% | 9.0% |
| Agree | 55.8% | 45.8% |
| Strongly Agree | 26.7% | 41.0% |

  • Winter 2018 showed a shift toward "Strongly Agree" (41.0% vs. 26.7%).
  • Combined Agree/Strongly Agree: Fall 82.5%, Winter 86.8%.

"Exams are representative of the course material":

| Response | Fall 2017 | Winter 2018 |
| --- | --- | --- |
| Strongly Disagree | 0.5% | 0.0% |
| Disagree | 0.5% | 1.4% |
| Neither Agree nor Disagree | 4.4% | 4.2% |
| Agree | 48.8% | 33.3% |
| Strongly Agree | 43.9% | 59.7% |

  • Winter 2018 showed even stronger agreement: 59.7% "Strongly Agree" vs. 43.9% in Fall.
  • Combined Agree/Strongly Agree: Fall 92.7%, Winter 93.0%.
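
The combined figures in both tables are simple sums of the two agreement levels; a few lines of Python, with the values transcribed from the tables above, reproduce them:

```python
# Recompute the combined Agree + Strongly Agree percentages reported above.
set_items = {
    "Assignments promote learning": {"Fall": (55.8, 26.7), "Winter": (45.8, 41.0)},
    "Exams are representative":     {"Fall": (48.8, 43.9), "Winter": (33.3, 59.7)},
}
for item, quarters in set_items.items():
    for quarter, (agree, strongly_agree) in quarters.items():
        print(f"{item} ({quarter}): {agree + strongly_agree:.1f}% combined positive")
```

Running this reproduces the combined positives cited above (82.5%, 86.8%, 92.7%, 93.0%).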

🎤 Winter 2018 in-class evaluation highlights

  • 100% of students felt the oral exam helped them remember key concepts (vs. 97% in Fall for "explain concepts"—slightly different wording).
  • 91% would not have preferred a written format (identical to Fall 2017).
  • 74% felt preparation time was equal or less (vs. 80% in Fall—slight decrease but still a strong majority).
  • 47% found it more nerve-wracking (vs. 53% in Fall—slight decrease in anxiety).

🔍 Consistency across courses

The Winter 2018 data reinforces Fall 2017 findings:

  • High preference for oral format persists across different course topics (Diversity and Race vs. Diversity and Health).
  • Anxiety remains a factor but does not override preference.
  • Perception of representativeness and learning promotion remained high or improved.

Winter 2018: Diversity and Health

🧭 Overview

🧠 One-sentence thesis

An oral midterm examination in a diversity and health course achieved high student satisfaction and learning effectiveness, with 100% of students reporting it helped them remember key concepts, though nearly half experienced more anxiety than with written exams.

📌 Key points (3–5)

  • Student preference: 91% of students preferred the oral format over written, and 100% felt it helped them remember key concepts.
  • Preparation time: 74% of students felt the oral exam required equal or less preparation time than a written exam.
  • Anxiety trade-off: 47% found preparing for the oral exam more nerve-wracking than a written one, despite preferring the format.
  • Common confusion: Students can feel more anxious yet still prefer and benefit more from oral exams—anxiety and effectiveness are separate dimensions.
  • TA benefits: Teaching assistants reported more efficient grading, clearer assessment of conceptual understanding, and reduced ambiguity compared to written exams.

📊 Student evaluation results

📈 Overall assignment ratings

The Winter 2018 course had 144 of 167 students respond to the Student Evaluation of Teaching (SET).

Assignments promote learning:

  • 86.8% agreed or strongly agreed (66 agreed, 59 strongly agreed)
  • Only 3.5% disagreed or strongly disagreed
  • 9.0% were neutral

Exams are representative of course material:

  • 93.0% agreed or strongly agreed (48 agreed, 86 strongly agreed)
  • Only 1.4% disagreed
  • 4.2% were neutral

🎯 Midterm-specific evaluation

119 students were present for the in-class midterm evaluation.

| Question | Result |
| --- | --- |
| Did the oral format help you remember key concepts? | 100% yes |
| Would you have preferred a written format instead? | 91% no (prefer oral) |
| Did preparing take more time than for a written exam? | 74% no (equal or less time) |
| Was preparing more nerve-wracking than for a written exam? | 47% yes (more anxious) |

🎓 Teaching assistant perspectives

⏱️ Efficiency and grading benefits

TAs consistently reported that oral exams saved time and simplified grading:

  • One TA noted: "Selfishly, for me as a reader, this greatly reduced the amount of time I had to spend on grading."
  • TAs could record answers on the same day.
  • The instructor provided a clear answer key that enabled quick and consistent grading.

Reduced ambiguity:

  • TAs could ask students to elaborate on answers in real time.
  • One TA explained: "those constant discussions with students after written examinations about what they were trying to say, simply isn't a concern in the oral examination because you resolve what the student means in the moment."
  • Students could not claim memory lapses afterward since they had opportunities to clarify during the exam.

🧠 Conceptual understanding assessment

TAs observed that oral exams better revealed students' true comprehension:

  • The format "spoke to the concept-learning focused pedagogy of the Professor."
  • One TA noted: "the benefit of an oral midterm is found in its ability to encourage creative thinking and synthesis in a conversational format."

Detecting memorization vs. understanding:

  • TAs could identify when students had simply memorized answers without grasping the material.
  • Example: Students who couldn't elaborate further or mixed up basic terms/concepts revealed incomplete understanding that might have been hidden in written responses.

😰 Student nervousness and mitigation strategies

All seven TAs noted that students were initially nervous, though this typically lessened during the exam:

  • "Once they sat down and took the exam, they were able to respond fluidly and without difficulty."
  • Students waiting in line felt relieved as others exited and reported it had gone well.
  • Immediate grade feedback (and generally good results) boosted confidence of waiting students.

Anxiety reduction techniques: One TA who administered oral exams in both courses developed specific strategies:

  • Used a casual setting (graduate lounge) to alleviate anxiety.
  • Avoided sitting behind a desk; instead sat to the side of students to diminish hierarchical structure.
  • Monitored her own body language to avoid expressions of rigidity.
  • Allowed students brief moments to breathe deeply and restructure their thoughts when needed.

Don't confuse: Initial nervousness with poor performance—most students performed well despite anxiety, and one TA noted that anxiety contributed to poor performance among only a few students.

📝 Student response categories

📋 General assignment evaluations

Of the 61 responses evaluating assignments, 19 addressed assignments in general:

| Category | Number of responses |
| --- | --- |
| Representativeness | 14 |
| Ease | 6 |
| Difficulty | 1 |

🎤 Midterm-specific evaluations

42 responses specifically evaluated the oral midterm (plus 2 from general class reviews):

| Category | Number of responses |
| --- | --- |
| Preference (total) | 39 |
| – Better for retaining material | 13 |
| – Understanding concepts | 10 |
| – Less stressful | 8 |
| – General preference | 8 |
| Representativeness | 7 |
| Ease | 7 |
| Anxiety | 3 |
| Difficulty | 1 |

The distribution shows students most frequently commented on their preference for the format, followed by its benefits for retention and understanding.

🔄 Comparison with Fall 2017

📊 Similar patterns across quarters

Both Fall 2017 (Diversity and Race) and Winter 2018 (Diversity and Health) showed:

  • High preference for oral format (91% in both quarters would not prefer written)
  • Strong agreement that the exam met learning goals (97% in Fall 2017; 100% in Winter 2018 for remembering concepts)
  • Majority felt preparation time was equal or less (80% in Fall 2017; 74% in Winter 2018)
  • Significant portion experienced more anxiety (53% in Fall 2017; 47% in Winter 2018)

📈 Assignment promotion of learning

Winter 2018 showed slightly higher strong agreement:

| Rating | Fall 2017 | Winter 2018 |
| --- | --- | --- |
| Strongly Agree | 26.7% | 41.0% |
| Agree | 55.8% | 45.8% |
| Combined positive | 82.5% | 86.8% |

📈 Exam representativeness

Winter 2018 showed higher strong agreement:

| Rating | Fall 2017 | Winter 2018 |
| --- | --- | --- |
| Strongly Agree | 43.9% | 59.7% |
| Agree | 48.8% | 33.3% |
| Combined positive | 92.7% | 93.0% |

The shift toward "Strongly Agree" in Winter 2018 suggests refinements to the oral exam format may have increased student confidence in its representativeness.


TA Experience with Oral Examinations

🧭 Overview

🧠 One-sentence thesis

TAs found that oral examinations reduced grading ambiguity and better assessed conceptual understanding, though they required strategies to manage students' initial nervousness.

📌 Key points (3–5)

  • Less ambiguity: Students could elaborate in the moment, eliminating post-exam disputes about what they meant to say.
  • Better conceptual assessment: Oral format revealed whether students truly understood concepts or had only memorized answers.
  • Initial nervousness: Most students were anxious beforehand but relaxed once they began; anxiety lessened as waiting students saw others succeed.
  • Common confusion: Don't confuse memorization with understanding—TAs could distinguish these during oral exams by asking students to elaborate or noticing mixed-up terms.
  • Grading efficiency: TAs reported the process was more efficient and simplified compared to written exams.

💬 Reduced ambiguity and in-the-moment clarification

💬 How oral exams eliminate post-exam disputes

  • TAs noted "less room for ambiguity" because students could elaborate on answers during the exam.
  • One TA stated: "those constant discussions with students after written examinations about what they were trying to say, simply isn't a concern in the oral examination because you resolve what the student means in the moment."
  • Students could not later claim memory lapses, since they had opportunities to clarify their answers immediately.

✅ Immediate resolution vs. written exam disputes

| Format | Ambiguity handling | Post-exam disputes |
| --- | --- | --- |
| Written exam | Interpretation happens after the fact | Students argue about what they meant |
| Oral exam | Clarification happens in the moment | No room for later claims of memory lapse |

🧠 Assessing conceptual understanding

🧠 Revealing true comprehension vs. memorization

The oral examination emphasizes students' ability to comprehend and explain course concepts.

  • TAs could identify when students had "simply 'memorized' the right answers" but didn't fully grasp the material.
  • Indicators of shallow understanding:
    • Inability to elaborate further when asked
    • Getting basic terms or concepts mixed up
  • Example: A student might recite a correct answer but cannot explain it in different words or apply it, revealing memorization without understanding.

🎯 Encouraging deeper thinking

  • One TA noted: "I think the benefit of an oral midterm is found in its ability to encourage creative thinking and synthesis in a conversational format."
  • The format aligned with "concept-learning focused pedagogy" rather than rote memorization.
  • Don't confuse: knowing the right answer (memorization) vs. being able to explain and elaborate (understanding).

😰 Managing student nervousness

😰 Initial anxiety patterns

  • All seven TAs commented on students' initial nervousness about the oral examination.
  • For most students, nervousness lessened once they actually took the exam.
  • One TA observed: "Once they sat down and took the exam, they were able to respond fluidly and without difficulty."
  • Waiting students felt relieved as others exited and reported it had gone well.

😰 When anxiety affected performance

  • One TA who administered oral exams in both classes noted that anxiety did contribute to poor performance among a few students.
  • This suggests nervousness was not always overcome and could impact results for some.

🛠️ Strategies to reduce anxiety

The TA who conducted oral exams twice developed specific techniques:

Physical environment:

  • First iteration: Set up the graduate lounge as the testing location, believing a casual setting might alleviate anxiety.
  • Second iteration: Used a traditional office but chose not to sit behind the desk; instead sat to the side of each student to diminish hierarchical structure.

Body language and pacing:

  • Watched her own body language to avoid expressions of rigidity.
  • Allowed students brief moments to breathe deeply and restructure their thoughts when needed.

Confidence building:

  • Gave students their grades immediately after the exam.
  • Since students did well, this helped boost the confidence of those still waiting.

📊 Efficiency and grading process

📊 Simplified grading

  • TA responses reflected "increased efficiency and simplification of the grading process."
  • The excerpt does not detail specific mechanisms, but TAs consistently noted this benefit alongside reduced ambiguity.

📊 Connection to course learning outcomes

  • Student feedback addressed the final course learning outcome: "explain key concepts around race and racism."
  • The oral examination was created specifically to assess students' ability to explain course content.
  • One student noted: "the oral exam was great and surprisingly easy to do I felt I had a better understanding of the information because I was able to explain it to someone else."
  • Another stated: "The oral exam did help increase my ability to explain things to other people."
  • This shows the format directly measured the intended learning outcome, even though some students preferred written exams due to anxiety.

Course Learning Outcomes

🧭 Overview

🧠 One-sentence thesis

Oral examinations successfully achieved the course learning outcome of helping students explain key concepts about race and racism to others, with students reporting better retention and understanding despite heightened anxiety.

📌 Key points (3–5)

  • Primary outcome measured: the ability to explain course content (not just recall facts), specifically concepts around race, racism, and health.
  • Student-reported success: 97–100% of students stated the oral exam helped them remember and explain key concepts.
  • Retention vs. memorization: students reported the format forced deeper understanding rather than short-term memorization.
  • Common confusion: anxiety about the unfamiliar format vs. actual difficulty—students were initially more anxious but found the exam "surprisingly easy" and preferable to written tests.
  • Why it matters: the oral format directly assessed the final learning outcome and improved material retention beyond traditional written exams.

📚 Learning outcomes for the courses

📚 Diversity and Race course (Fall 2017)

The course had four learning outcomes:

  1. Distinguish and define key terms (ethnicity/race/nationality, ethnocentrism/racism/nationalism, implicit bias/prejudice/discrimination)
  2. Describe and assess differences in racialization and racial systems among various racial minorities
  3. Synthesize an anthropological concept of race (biological, socio-cultural, and historical perspectives)
  4. Explain key concepts around race and racism (social construction of race, systemic racism, racial privilege)

🏥 Diversity and Health course (Winter 2018)

The course had three learning outcomes:

  1. Synthesize an anthropological concept of health (socio-cultural, biological, and historical perspectives)
  2. Distinguish and define key concepts (culture, ethnomedicine, health inequality, structural violence, biopolitics, syndemics)
  3. Describe, assess, and apply how key concepts relate to various health issues in global contexts

🎯 How the oral exam targeted the final outcome

🎯 Focus on explanation ability

  • The oral examination was created specifically to assess the final course learning outcome: the ability to explain concepts.
  • Student feedback from course evaluations directly addressed this outcome, even though the evaluation did not explicitly prompt them about it.
  • Example: One student noted, "His push to make sure we are able to explain things to others by not just taking a normal written exam really is a good one."

📊 Direct measurement results

The midterm feedback included a question directly asking if the oral exam helped students explain concepts:

| Course | Percentage who said "yes" | What students reported |
| --- | --- | --- |
| Fall 2017 (Race) | 97% | Oral exam helped explain concepts around race and racism |
| Winter 2018 (Health) | 100% | Oral exam helped remember key concepts |

💬 Student testimony on explanation

  • "I felt I had a better understanding of the information because I was able to explain it to someone else."
  • "The oral midterm helped me solidify the concepts learned in the class because we were forced to talk about the concepts and put me in the position to essentially teach the TA about the concepts."
  • One student acknowledged anxiety but recognized long-term benefit: "The oral exam did help increase my ability to explain things to other people. I would not prefer this to a written exam just because of anxiety issues but I guess this could help me in the long run."

🧠 Retention and deeper understanding

🧠 Long-term memory vs. short-term memorization

Students repeatedly contrasted the oral format with traditional memorization:

  • "To this day, remember the questions, and the answers to the questions. It truly made me learn the concepts and not short-term memorize them."
  • "Instead of studying to memorize key terms, I had to fully understand the concept, which helped me remember and fully conceptualize the topics."
  • "It helped myself retain the information much better. Instead of studying to memorize key terms, I had to fully understand the concept."

🔍 Why the format forced deeper processing

  • Students had to talk about and teach the concepts, not just write them down.
  • The format required "more effort… to actually know the material and examples."
  • One student specified retention of a particular concept: "It definitely helped me retain and understand the key term of structural violence."

⚠️ Don't confuse

  • Not just recall: the oral exam did not measure simple retrieval of facts; it required processing information in novel ways to explain it aloud.
  • Not just any learning outcome: while the courses had multiple outcomes (distinguish, describe, synthesize), student feedback overwhelmingly addressed the explanation outcome, which the oral exam was designed to assess.

😰 Anxiety and student experience

😰 Heightened anxiety reported

  • Students were initially more anxious about the oral examination because it was an unknown format.
  • Despite largely positive findings, heightened testing anxiety was a key feature of oral examinations.
  • One student noted: "I would not prefer this to a written exam just because of anxiety issues."

🛠️ Strategies to mitigate anxiety

The TA employed several techniques, refined from the first to the second iteration:

  • Setting: chose a graduate lounge (first iteration) for a more casual environment; some students commented the room felt "nice."
  • Furniture arrangement: in the second iteration, sat to the side of each student instead of behind a desk to diminish hierarchical structure.
  • Body language: watched her own body language to avoid expressions of rigidity.
  • Pacing: allowed students brief moments to breathe deeply and restructure their thoughts when needed.

✅ Anxiety vs. actual difficulty

  • Common confusion: students expected the oral exam to be harder because of their anxiety, but many found it "surprisingly easy."
  • "The oral exam was great and surprisingly easy to do."
  • "It was a lot less stressful than traditional midterm exams."
  • Students spent about the same amount of time studying compared to a written exam, suggesting the anxiety was about the format, not the workload.

📝 Comparison to written exams

📝 Student preferences

Students compared the oral examination to traditional written midterms:

| Aspect | Student feedback |
| --- | --- |
| Ease | "Surprisingly easy"; "a lot less stressful than traditional midterm exams" |
| Learning | "Made me learn the material better"; "more conducive to learning" |
| Retention | "Helped myself retain the information much better" |
| Effort | "More effort required of you to actually know the material and examples" |

🎓 Why students preferred it

  • Immediate feedback: students appreciated "leaving the session with their grade rather than waiting for grading to occur."
  • Forced understanding: "There's more effort required of you to actually know the material and examples."
  • Example: "Having to do an oral midterm actually made me learn the material better because there's more effort required of you to actually know the material and examples."

⚖️ Trade-offs acknowledged

  • Not unanimous: "While there was not unanimous agreement on enjoying the process of oral examination," the responses revealed it was a direct measurement of the final learning outcome.
  • Anxiety remained a concern for some students, particularly those with anxiety issues or non-native English speakers (mentioned at the end of the excerpt).

Oral Examinations: Student and Teaching Assistant Experiences

DISCUSSION

🧭 Overview

🧠 One-sentence thesis

Oral examinations in large lecture courses generated highly positive responses from students and teaching assistants despite initial anxiety, with students reporting improved concept retention and preference over written exams.

📌 Key points (3–5)

  • Initial anxiety vs. eventual preference: Students were more anxious about oral exams because of the unfamiliar format, but after preparatory modeling they found them easier and preferable to written exams.
  • Perceived learning benefits: Students reported that oral exams improved their understanding, retention, and ability to explain concepts, aligning with course learning outcomes.
  • TA advantages: Teaching assistants praised the format for reducing grading time and providing clear, transparent scoring with immediate feedback to students.
  • Common confusion: Anxiety does not mean poor outcomes—students performed well (95–97% received A grades) despite heightened nervousness.
  • Alignment with outcomes: The exam format successfully prepared students for real-world discussions and matched explicitly stated course goals like explaining concepts to others.

😰 Student anxiety and preparation

😰 Initial anxiety from unfamiliar format

  • Students reported higher anxiety because oral examinations were an unknown assessment method.
  • The excerpt notes this aligns with existing literature: oral exams consistently produce more nervousness than written ones.
  • Possible reasons for anxiety include:
    • Expectations that oral tasks require deeper understanding
    • Having to perform socially in a professional setting

🛠️ Preparatory modeling reduced concerns

  • The instructor provided:
    • Study suggestions
    • A review day for practicing questions before the exam
  • After this preparation, students felt the format was "generally easier and preferable to written examinations."
  • Students spent about the same amount of time studying as they would for a written exam.

📈 Performance despite anxiety

  • Students performed well: 97% received an A in Fall 2017, 95% in Winter 2018.
  • The excerpt notes that one study found students were more nervous in oral exams but performed better compared to written ones.
  • Don't confuse: High anxiety does not mean poor performance or that the format is ineffective.

🧠 Learning and retention benefits

🧠 Improved understanding and retention

Students noted in their evaluations that the format improved their understanding of concepts and retention of material.

  • The instructor expected oral exams would "force students to process the information in novel ways."
  • Student self-reports confirmed this expectation.

🔍 Critical thinking over simple retrieval

The excerpt cites existing studies showing oral exams promote:

  • Increased critical thinking and analysis (rather than simple retrieval)
  • Improved reflective thinking
  • Enhanced creativity and interaction between faculty and students

🗣️ Alignment with diversity education goals

  • Critical thinking and reflection are important components of diversity education.
  • Teaching diversity is "not simply about knowledge acquisition but opportunities for affective experiences that lead to changes in attitude and worldview."
  • Example: 97% of students in the Diversity and Race course stated the oral exam helped them explain concepts around race and racism to other people—an explicitly stated course outcome.

👥 Teaching assistant perspectives

⏱️ Reduced grading time

  • TAs highly praised the format for reducing grading times.
  • This was a primary objective: reducing the load for graders and TAs who typically run two discussion sections.
  • Common assumption challenged: Much oral examination literature focuses on extra time needed to administer exams, but this experience found the opposite.

📋 Clear scoring and immediate feedback

  • Clear rubrics were provided and assessed transparently.
  • TAs preferred this format because it "reduced judgments about scoring."
  • Students appreciated "leaving the session with their grade rather than waiting for grading to occur after turning in a written exam."

🔄 Implementation improvements

| Implementation | Notes |
| --- | --- |
| Fall 2017 pilot | Challenge: difficulties in coordination and timing; giving students relaxation time created backlogs |
| Second implementation | Solution: longer time windows for sign-ups; the format proved feasible using Google Documents |
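
The backlog problem suggests a simple scheduling fix: publish fixed sign-up slots with a small buffer between exams. The sketch below assumes a 10-minute exam and a 2-minute buffer; both numbers are invented for illustration (the excerpt reports only that longer sign-up windows and Google Documents made coordination feasible).

```python
# Hypothetical sign-up slot generator for oral exams (e.g., to paste into
# a shared Google Document). Slot length and buffer are assumptions, not
# values reported by the study.
from datetime import datetime, timedelta

def make_slots(start: datetime, n_students: int,
               minutes_per_exam: int = 10, buffer_minutes: int = 2) -> list[str]:
    """One labeled slot per student, with a buffer so a late start
    does not cascade into 30-minute waits."""
    step = timedelta(minutes=minutes_per_exam + buffer_minutes)
    return [f"Slot {i + 1}: {start + i * step:%a %H:%M}" for i in range(n_students)]

# Example: twelve students signed up with one TA on a Monday morning.
for slot in make_slots(datetime(2018, 2, 5, 9, 0), 12):
    print(slot)
```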

🤔 TA concerns

  • Student anxiety in the oral format may reduce performance.
  • The format may be more difficult for non-native English speakers, particularly those better at written English.
  • The excerpt notes this population was not examined in prior inclusive assessment research and warrants further exploration at institutions where approximately 20% of students are international.

🎯 Learning outcomes and course design

🎯 Priming effect of stated outcomes

It is possible the learning outcomes from the syllabus primed students for the experience they would have.

  • Fall 2017 course: Learning outcome was "explaining course concepts" → students stressed liking the oral exam for that reason.
  • Winter 2018 course: Learning outcome was "to understand" → students focused on how it helped them apply and retain concepts.
  • Student responses on evaluations reflected these different emphases.
  • This suggests students were clear about course learning outcomes and that oral examinations can effectively steer students toward different learning goals.

📊 Maturation and motivation effects

  • In the second implementation, anxiety was markedly lower even though the examination had effectively doubled in length.
  • Possible reasons:
    • The class was particularly motivated (upper-level required course for the major)
    • Maturation effect: students' comfort with speaking in academic settings improves over time
  • Students were less likely to comment on increased difficulty, suggesting it was proportional to expectations in an upper-level course.

🔄 Attempts to increase grade variance

The instructor tried to create more variation in outcomes by:

  • Increasing the number of questions
  • Incorporating a confidence score
  • Allowing students to pass a question

These attempts did not lead to greater variation (still 95–97% A grades).
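
The excerpt does not reproduce the rubric, so the following is only a guess at how a per-question content score, a confidence score, and the option to pass a question might combine into a grade; the scales and weights are invented for illustration.

```python
# Hypothetical grading sketch: correctness against the answer key plus a
# TA-assigned confidence score, with one allowed pass. Scales and weights
# are invented; the actual rubric is not reproduced in the excerpt.
from dataclasses import dataclass

@dataclass
class QuestionScore:
    correct: bool         # matched the answer key (after any clarification)
    confidence: int       # hypothetical 0-2 scale assigned by the TA
    passed: bool = False  # students were allowed to pass a question

def grade(scores: list[QuestionScore]) -> float:
    """Percentage over non-passed questions (illustrative only)."""
    answered = [s for s in scores if not s.passed]
    if not answered:
        return 0.0
    points = sum((2 if s.correct else 0) + s.confidence for s in answered)
    return 100 * points / (4 * len(answered))  # 4 points max per question

print(grade([QuestionScore(True, 2), QuestionScore(True, 1),
             QuestionScore(False, 0, passed=True)]))  # prints 87.5
```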

Future plan: Ask students to define two key terms and make connections via course examples, requiring conceptual work that cannot necessarily be practiced and memorized in advance.

📚 Spread and adoption

📚 Format adoption by other instructors

Following the Fall and Winter 2017–2018 offerings, two other instructors used this approach in Spring 2018:

  • A former TA reproduced it as an instructor for her own Diversity and Race course.
  • Another former TA proposed the format for a linguistic anthropology course taught by a different instructor.
  • The new instructor believed the format "made it clear whether the students knew the material, when they could speak about the concepts in this assessment format."

⚠️ Study limitations

⚠️ Methodological constraints

  • The study is retrospective and based on cases from one instructor's courses.
  • Based on continuous improvement of assessment instruments and reflection by a new assistant professor, TAs, and student self-reports.
  • Student preferences via evaluations "are not objective measures for the validation and success of an assessment."
  • Coding of qualitative data was conducted by one individual.

📊 Sample limitations

  • Limited to students of this instructor in two courses that vary significantly:
    • Lower-level diversity requirement
    • Upper-level major requirement
  • Student populations and motivations differed between the two cases.

🔬 Lack of experimental design

  • The study did not experimentally randomize participation in oral versus written examinations.
  • Cannot comment on potential performance differences between formats.
  • Cannot establish actual increase in learning, only student reporting of perceived benefits.
  • To establish actual learning increase, would need a randomized control trial comparing oral to written exam methodologies.
  • Without an independent measure over time, would only know difference in exam performance rather than actual learning.

📖 Limited literature

As the data on oral assessments are limited and there is minimal literature on this assessment form, results must be viewed with caution.

Only one cited study (Huxham et al., 2012) used randomization between oral and written exam conditions to assess differences in student performance.

🔮 Future research directions

🔮 Needed studies

  • Additional studies are necessary to make clear judgments about efficacy and efficiency.
  • Future studies could use randomized control designs powered to detect significant differences.
  • Example approach: randomizing half of the students or discussion sections to oral versus written conditions.

LIMITATIONS

🧭 Overview

🧠 One-sentence thesis

This retrospective study of oral examinations in large lecture classes relies on student self-reports and instructor reflection rather than objective measures, so results must be interpreted cautiously and require further controlled research to validate efficacy.

📌 Key points (3–5)

  • Study design weakness: retrospective case study based on continuous improvement and reflection, not a controlled experiment.
  • Data source limitations: relies on student evaluation of teaching (SET) surveys, which are not objective measures of learning outcomes.
  • Coding and sample constraints: qualitative data coded by one person; sample limited to one instructor's two courses with different student populations.
  • Common confusion: student preference (measured by SETs) vs. actual learning—positive student feedback does not prove better learning outcomes.
  • Need for future work: randomized controlled trials, objective measures of retention and understanding, and time-cost analysis are necessary to establish true efficacy.

🚧 Core methodological weaknesses

🔄 Retrospective design

  • The study looks back at cases that already occurred, rather than planning a controlled experiment in advance.
  • Based on continuous improvement of assessment instruments and reflection by a new assistant professor teaching large lectures for the first time.
  • Also draws on reflections from TAs (graduate students in a PhD program) and student self-reports in SETs.
  • Why this matters: retrospective studies cannot control for confounding variables or establish causation as rigorously as prospective designs.

📊 Reliance on student evaluations

Student preferences via SETs are not objective measures for the validation and success of an assessment.

  • The study acknowledges that SETs measure student perception, not actual learning.
  • Don't confuse: positive student experience ≠ proof of better learning outcomes.
  • Example: students may report that the oral exam format is "easier and preferable," but this does not demonstrate they learned more or retained information longer.

🔍 Single-coder analysis

  • Qualitative data was coded by one individual, introducing potential bias.
  • No mention of inter-rater reliability or validation by a second coder.
  • This reduces confidence in the interpretation of qualitative responses.

🎯 Sample and context limitations

👥 Limited sample scope

  • Sample restricted to students of one instructor across two courses.
  • The two cases vary significantly in student populations and motivations:
    • Lower-level diversity requirement course
    • Upper-level major requirement course
  • Different student motivations and course levels make it difficult to generalize findings.

📚 Minimal existing literature

  • The excerpt notes that "data on oral assessments are limited and there is minimal literature on this assessment form."
  • Only one prior study (Huxham et al., 2012) used randomization between oral and written exam conditions.
  • Implication: without a strong evidence base, these exploratory findings cannot be confidently generalized.

🔬 What is missing for validation

🧪 Need for controlled experiments

  • Future studies should use randomized control designs:
    • Randomize half of students or discussion sections to oral vs. written exam conditions.
    • Studies must be "powered to detect significant differences" (have sufficient sample size).
  • Example approach: randomly assign discussion sections of a large lecture to either oral or written exam format, then compare outcomes.
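
A minimal sketch of that assignment step, assuming ten discussion sections and a fixed seed for reproducibility (both placeholders):

```python
# Randomly assign discussion sections of one large lecture to oral vs.
# written exam conditions. Section labels, count, and seed are placeholders.
import random

sections = [f"A{n:02d}" for n in range(1, 11)]  # ten hypothetical sections
rng = random.Random(2018)                       # fixed seed: reproducible assignment
shuffled = sections[:]
rng.shuffle(shuffled)

half = len(shuffled) // 2
condition = {sec: ("oral" if i < half else "written")
             for i, sec in enumerate(shuffled)}

for sec in sections:
    print(sec, "->", condition[sec])
```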

📏 Objective learning measures needed

Future research should test:

  • Retention: differences in student understanding at shorter and longer intervals (one week, one month, six months).
  • Conceptual processing: how well students process and apply conceptual information.
  • Self-efficacy: confidence in explaining concepts to others.
  • Actual application: real utilization of the material in practice.
  • Don't confuse: self-reported learning (what students say they learned) vs. measured learning (what they can demonstrate over time).

⏱️ Time and cost analysis

  • The excerpt calls for close measurement of time commitments:
    • Preparing students for the exam
    • Administering the exam
    • Grading both types of exams
  • Goal: determine actual time savings or costs.
  • Why this matters: oral exams may reduce TA grading time (important where union contracts restrict TA hours or where graduate students need more research time), but preparation and administration time must be weighed against these benefits.

🔮 Additional research directions

🧑‍🎓 Different student populations

  • Future work should examine differences in exam modalities for various groups:
    • Students with various disabilities (potential benefits not yet explored).
    • Older vs. younger students (older students may be more ingrained in traditional exam styles and thus more appreciative of different formats).
    • Student maturation and practice speaking in courses.
    • Student commitment to class content as potential mediators of examination outcomes.

📉 Novelty vs. standardization

  • The study observed a decrease in anxiety between the two quarters an oral examination was administered.
  • Questions to explore:
    • Does novelty of the format make a difference compared to standardized assessment practice?
    • How does repeated exposure to oral exams affect student performance and anxiety?

🛠️ Scaffolding and preparation

  • Types of scaffolding and preparation for students should be explored more thoroughly to reduce student anxiety.
  • The excerpt notes that students were "initially more anxious about the oral examinations because they were an unknown format."
  • Better understanding of effective preparation methods could improve student experience and outcomes.

FURTHER WORK AND CONCLUSION

🧭 Overview

🧠 One-sentence thesis

Future research on oral examinations should use randomized controlled designs to measure efficacy, efficiency, time costs, and differential impacts across student populations, while also exploring scaffolding strategies to reduce anxiety.

📌 Key points (3–5)

  • Current evidence gap: Only one study (Huxham et al., 2012) has used randomization to compare oral vs. written exams; more rigorous designs are needed.
  • What future studies should measure: student understanding and retention over time, conceptual processing, self-efficacy, actual application of material, and differences across student populations.
  • Time commitment uncertainty: Future work must closely measure preparation, administration, and grading time to determine actual savings or costs.
  • Common confusion: novelty vs. standardized practice—older students may respond differently than younger students; maturation and practice speaking may mediate outcomes.
  • Practical constraints: scaffolding and preparation strategies need exploration to reduce anxiety; union contracts and TA hour limits may influence feasibility.

🔬 Research design needs

🎲 Randomized control designs

  • The excerpt notes that only one study (Huxham et al., 2012) has used randomization between oral and written exam conditions.
  • Future studies should:
    • Randomize half of students or discussion sections in a large lecture to oral or written exam conditions.
    • Power the design to detect significant differences.
  • Example: In a large lecture course, half the discussion sections receive oral exams, half receive written exams, then compare outcomes.
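
The "powered to detect significant differences" step can be sketched with a standard two-proportion power calculation. The proportions below (hypothetical benchmark rates under each condition) are invented for illustration; only the general statsmodels machinery is assumed.

```python
# Rough per-group sample size for detecting a difference between two
# proportions (e.g., the share of students reaching some benchmark under
# oral vs. written conditions). The 0.80 and 0.65 rates are invented;
# substitute pilot estimates in a real design.
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

effect = proportion_effectsize(0.80, 0.65)  # Cohen's h for the two rates
n_per_group = NormalIndPower().solve_power(
    effect_size=effect, alpha=0.05, power=0.80,
    ratio=1.0, alternative="two-sided",
)
print(f"h = {effect:.3f}, students needed per condition ~ {n_per_group:.0f}")
```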

📊 What to measure for efficacy

The excerpt lists several outcome measures future studies could test:

| Outcome dimension | What to measure |
| --- | --- |
| Retention | Differences in student understanding at shorter and longer intervals (week, month, six-month follow-up) |
| Conceptual processing | How students process conceptual information |
| Self-efficacy | Confidence in explaining concepts to others |
| Application | Actual utilization or application of the material |

  • Don't confuse: "efficacy" here means learning outcomes, not just student satisfaction or preference.

👥 Student population differences

🧩 Various student groups

  • Future work should examine differences in exam modalities for different student populations.
  • The excerpt specifically mentions:
    • Students with various disabilities may benefit.
    • Older students may be more ingrained in traditional exam styles and thus more appreciative of a different format.
  • Example: An organization could compare oral exam outcomes for students with learning disabilities vs. those without.

🔄 Novelty vs. standardization

  • The excerpt raises the question: does novelty of the format make a difference compared to standardized assessment practice?
  • Older students may respond differently than younger students due to greater experience with traditional exams.

📈 Maturation and practice mediators

  • The excerpt observed a decrease in anxiety between the two quarters an oral examination was administered.
  • Potential mediators to explore:
    • Student maturation (i.e., practice speaking in courses).
    • Commitment to the class content.
  • Don't confuse: the decrease in anxiety could be due to practice or to greater investment in the course, not just the format itself.

⏱️ Time and resource considerations

⏲️ Time commitment measurement

Future inquiries should closely measure the time commitments involved in preparing students for, administering, and grading both types of exams to determine how much time savings or costs, if any, are present.

  • Current data do not clearly establish whether oral exams save or cost time.
  • What needs measurement:
    • Preparation time for students.
    • Administration time.
    • Grading time.
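
Once those times are logged, the comparison is simple bookkeeping. The hour figures below are placeholders, not measured values; the point is only that each format's total cost is preparation plus administration plus grading.

```python
# Placeholder bookkeeping for the time comparison the excerpt calls for.
# All hour figures are invented; log real times per course to fill them in.
def total_hours(prep: float, administer: float, grade: float) -> float:
    return prep + administer + grade

oral = total_hours(prep=6, administer=30, grade=4)     # fast grading, long administration
written = total_hours(prep=4, administer=3, grade=45)  # one sitting, long grading

print(f"oral: {oral} h, written: {written} h, difference: {written - oral} h")
```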

🎓 Scaffolding and preparation

  • The types of scaffolding and preparation for students should be explored more thoroughly to reduce student anxiety.
  • Example: Providing practice sessions, rubrics in advance, or peer practice opportunities.

🏫 Practical constraints in large courses

  • The excerpt mentions potential benefits for large courses where:
    • TA grading needs to be minimized (e.g., to increase focus on graduate student research).
    • Union contracts restrict TA hours.
  • However, time constraints of other oral examination formats (e.g., short interviews in smaller class settings) should also be explored to weigh costs and benefits.
  • Don't confuse: the excerpt does not claim oral exams always save time; it calls for studies to determine whether they do.

🚧 Study limitations context

📝 Retrospective nature

  • The excerpt notes this study is retrospective and based on cases that may interest instructors with large lecture classes.
  • Data sources:
    • Continuous improvement of assessment instruments (such as SETs—student evaluation of teaching).
    • Reflection by a new assistant professor teaching large lecture classes for the first time.
    • TAs (graduate students in a PhD program).
    • Student self-reports in SETs.

⚠️ Data limitations

  • Student preferences via SETs are not objective measures for validation and success of an assessment (Uttl et al., 2017).
  • Coding of qualitative data was conducted by one individual.
  • Sample limited to students of this instructor; the two cases vary significantly (lower-level diversity requirement vs. upper-level major requirement).
  • Student populations and motivations differ between the two cases.

🔍 Caution in interpretation

As the data on oral assessments are limited and there is minimal literature on this assessment form, results must be viewed with caution.

  • The excerpt emphasizes that additional studies are necessary to make clear judgments about efficacy and efficiency.