Name
Post Conference Workshop: Beyond the Bubble: An Introduction to Performance-Based Assessments
Date & Time
Wednesday, March 4, 2026, 1:30 PM - 3:30 PM
Speakers
Kimberly Swygert, Clyde Seepersad, John Zarian
Description

Performance-based assessments (PBAs) provide a robust framework for evaluating complex competencies, often those that cannot be fully captured with multiple-choice questions. However, even well-designed, authentic, task-centered assessments can present new challenges. This post conference workshop will introduce attendees to the foundational elements of PBA, beginning with the identification of performance constructs: the observable behaviors or skills that align with real-world tasks. Effective task design is critical for PBAs, which often include simulations, case-based scenarios, or structured interactions that mirror educational or professional contexts.

Next, the speakers will discuss the diverse response modalities used in PBAs, which can include written text, spoken responses, and physical task execution. Each modality introduces unique challenges for reliability and generalizability, particularly because of task dependency and contextual variability; strategies such as task sampling, rater training, and statistical modeling can be employed to mitigate these issues. Scoring approaches for PBAs range from binary checklists to nuanced rating scales, with both human and automated scoring methods in use. While automated scoring offers scalability, human raters provide nuanced, contextual judgment, along with potential rater bias that must be addressed through training, calibration, and monitoring. Recent advances in technology, including artificial intelligence, now make it possible to implement PBA tasks that were formerly infeasible, unscalable, or prohibitively expensive. Other technological tools, such as tablet-based testing, allow for enhanced data analysis, auditing, and security measures.

Finally, the speakers will discuss validity threats specific to PBAs, including construct underrepresentation and construct-irrelevant variance, which often stem from poorly aligned tasks, rater bias, or inconsistent scoring. Practical considerations such as development costs, scoring logistics, and return on investment (ROI) will be covered as well, with emphasis on the strategic planning needed to ensure a successful PBA program.

During this session, attendees will learn about:

  1. The basics of developing and using PBAs in competency-based assessments, primarily in the certification context
  2. The potential variety of modalities for PBA and the impact of modality choice on critical psychometric aspects of the assessment
  3. The different types of threats to security and validity, especially with respect to rater bias
  4. Emerging technical capabilities and technologies to support PBAs
     
Session Type
Post Conference Workshop