Overview
The Tucson Educational Policy Committee (TEPC) establishes the Examination Review Subcommittee (ERS) as a standing subcommittee of TEPC. This subcommittee holds primary responsibility for overseeing the quality and outcomes of student performance assessment for Years 1 and 2. The “Performance Assessment Plan for the Tucson Years 1 and 2” is the domain of this subcommittee, and reports on the plan’s implementation and its outcomes will be delivered regularly to TEPC and its other subcommittees.
Membership
TEPC regards the guidelines and procedures for the oversight of performance assessment as a curriculum support function of the Curricular Affairs office. As such, the members of this subcommittee will be appointed by the Associate Dean of Curricular Affairs. The Senior Manager of Assessment and Evaluation will chair the subcommittee.
The membership of the Exam Review Subcommittee (ERS) comprises the following individuals. (Note: it is important that the core membership include members who have not been involved [or who have had minimal involvement] in the construction of the exam items being considered. Because of the nature of the work performed by this subcommittee, there may be no student representatives.)
- Experts in Student Performance Assessment (Curricular Affairs)
- One Discipline Director
- AMES member or faculty with experience writing MCQs (preferably NBME items)
- Other ad hoc content consultants (block directors and/or ERS members can recommend content experts to consult with the subcommittee on specific items). Optimally, both clinical and basic science content expertise will be represented.
Responsibilities
The ERS will establish the criteria by which exam items are deemed to be of high quality, and these criteria will be applied consistently when making decisions about exam items. The criteria that define acceptable and unacceptable item statistics will be set by the ERS, approved by TEPC, and published in the appropriate sections of the block director manuals. Other decision criteria may address situations such as when two answer choices might be accepted.
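For context, the sketch below illustrates the kind of classical item statistics such criteria typically address (item difficulty and point-biserial discrimination). The thresholds shown are hypothetical placeholders for illustration, not the criteria that the ERS will set and TEPC will approve.

```python
from statistics import mean, pstdev

def item_statistics(responses, item_index):
    """responses: one list of 0/1 item scores per student, all the same length."""
    item_scores = [r[item_index] for r in responses]
    rest_scores = [sum(r) - r[item_index] for r in responses]  # rest-of-test score

    difficulty = mean(item_scores)            # proportion answering correctly
    mean_rest = mean(rest_scores)
    sd_item, sd_rest = pstdev(item_scores), pstdev(rest_scores)

    if sd_item == 0 or sd_rest == 0:          # no variance: discrimination undefined
        discrimination = 0.0
    else:
        cov = mean((i - difficulty) * (t - mean_rest)
                   for i, t in zip(item_scores, rest_scores))
        discrimination = cov / (sd_item * sd_rest)

    return {"difficulty": difficulty, "discrimination": discrimination}


# Hypothetical cut-offs shown for illustration only; the actual thresholds are set
# by the ERS, approved by TEPC, and published in the block director manuals.
def flag_item(stats, min_p=0.30, max_p=0.95, min_rpb=0.15):
    return (stats["difficulty"] < min_p
            or stats["difficulty"] > max_p
            or stats["discrimination"] < min_rpb)
```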
Because the evaluation of exam items includes a corresponding evaluation of the learning objectives they reference, the ERS will be copied on block changes that invoke the Year 1 and 2 Block/Course Changes policy with respect to objectives.
- The ERS will apply the criteria in the preview of items for those exams and quizzes that contribute more than 10% of a student’s grade in any block. The materials associated with these high-stakes assessments should be provided to the ERS at least two to three weeks in advance of the assessment administration for review.
- The ERS will apply the criteria after the delivery of an exam:
- The ERS has final approval regarding dropping or revising items and the subsequent changes to grading that might arise. Initial raw item/exam performance data must be stored before revisions take place.
- The ERS’s decision to drop an item or otherwise change the scoring of an item will be sent immediately to Block Coordinators or Instructional Technology so that scores can be recalculated (if needed) prior to the release of exam scores (a re-scoring sketch follows this list).
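As an illustration of the re-scoring step referenced above, the sketch below shows one way dropped items could be excluded from scoring while the initial raw data are preserved. The function and data-structure names are hypothetical and do not reflect the actual test administration software.

```python
from copy import deepcopy

def rescore_after_drops(raw_results, dropped_items):
    """Recompute percentage scores with ERS-dropped items excluded.

    raw_results: {student_id: {item_id: 0 or 1}} -- hypothetical structure
    dropped_items: set of item_ids the ERS has decided to drop
    """
    archive = deepcopy(raw_results)   # preserve the initial raw item/exam data

    rescored = {}
    for student, answers in raw_results.items():
        kept = {item: score for item, score in answers.items()
                if item not in dropped_items}
        # Percentage out of the remaining items; guard against an empty item set.
        rescored[student] = 100.0 * sum(kept.values()) / len(kept) if kept else 0.0

    return archive, rescored
```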
Policies Related to the Oversight of Examinations and Certification of Exam Items
- High-Stakes Exam Review Policy
- There will be consistent implementation of exams across blocks in Years 1 & 2. This includes regularly scheduled block exams, retake exams, and remediation exams. Implementation includes procedures for question pre-vetting, exam delivery, post-hoc analysis of exams, and subsequent rules for making adjustments to grades.
- Items for an exam will be selected according to acceptable psychometric parameters based on previous iterations, provided that they are aligned with current competency-based learning objectives (an illustrative selection sketch follows this list).
- For academic year 2020-21 and forward, new items or revisions to current items must be previewed by the ERS and tagged in the test administration software according to ERS specifications prior to inclusion on an exam.
- New exam items will be vetted through a consistent process established by the ERS. Approval of an item by this process will certify that it adheres to principles of high-quality item construction. That is:
- its assessment intent is unambiguous
- it is typographically and grammatically correct with an appropriate level of clarity and difficulty, and
- it demonstrates an appropriate level of association with competency-based learning objectives
- For development of new questions and/or revision of underperforming items, beta testing of items may be conducted within exams, provided that they are tagged as such.
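As a rough illustration of the selection and tagging workflow described above, the snippet below filters an item bank by historical statistics, objective alignment, and an ERS preview tag. The field names, tag value, and thresholds are assumptions for illustration, not the format used by the test administration software.

```python
def select_exam_items(item_bank, needed, current_objective_ids):
    """item_bank: list of dicts with hypothetical fields
       'id', 'difficulty', 'discrimination', 'objectives', 'ers_tags'."""
    eligible = [
        item for item in item_bank
        if 0.30 <= item["difficulty"] <= 0.95                     # illustrative thresholds only
        and item["discrimination"] >= 0.15
        and set(item["objectives"]) & set(current_objective_ids)  # aligned to current objectives
        and "ERS-previewed" in item["ers_tags"]                   # hypothetical ERS tag
    ]
    # Prefer the most discriminating items when more are eligible than needed.
    eligible.sort(key=lambda item: item["discrimination"], reverse=True)
    return eligible[:needed]
```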
After an exam has been delivered
- The block director may direct Instructional Technology to DROP a question in situations in which there is a technical error in item delivery (e.g., question repeated on the exam, an item has two identical foils, images failed to load). The ERS must be notified of any such technical error drops immediately.
- The ERS will be scheduled to meet within 24 hours after the delivery of each midterm exam and within two hours of each final exam. In this way, the subcommittee will be on hand to review exams in a timely manner.
- If review of exam statistics or competency alignment prompts concern about an item after the exam has been delivered, the item will be reviewed by the ERS in accordance with established criteria.
- Student challenges of exam items will be reviewed by the ERS only when the exam statistics for that item prompt concern.
- Results from exams will be posted to the gradebook in Medlearn online. Typically, student scores will be posted within one to two business days to allow for the possible adjustment of items.
Relevant Accreditation Standards
9.4 Assessment System
A medical school ensures that, throughout its medical education program, there is a centralized system in place that employs a variety of measures (including direct observation) for the assessment of student achievement, including students’ acquisition of the knowledge, core clinical skills (e.g., medical history-taking, physical examination), behaviors, and attitudes specified in medical education program objectives, and that ensures that all medical students achieve the same medical education program objectives.
9.6 Setting Standards of Achievement
A medical school ensures that faculty members with appropriate knowledge and expertise set standards of achievement in each required learning experience in the medical education program.
7.4 Critical Judgment/Problem-Solving Skills;
7.6 Structural Competence, Cultural Competence and Health Inequities;
7.9 Interprofessional Collaborative Skills
These standards explicitly ask each medical school to demonstrate how it assesses specific knowledge and skills, such as knowledge and understanding of societal needs and demands on health care, basic principles of clinical and translational research, and cultural competency.