Precision and Robustness in Adaptive Testing


Dissertation: An IRT Analysis within an Online Convex Optimization Framework

My dissertation studies the finite-sample performance of Computerized Adaptive Testing (CAT) algorithms through the lens of the Online Convex Optimization (OCO) framework, focusing on two complementary themes. On the precision front, I show that classical information-maximization item selection constitutes a no-regret algorithm under OCO, and I construct finite-sample confidence intervals with anytime-valid coverage using martingale concentration inequalities and confidence-sequence theory. On the robustness front, I adapt the Upper Confidence Bound (UCB) algorithm from the multi-armed bandit literature to design stability-oriented item selection strategies that manage the exploration-exploitation trade-off. These strategies address two failure modes: premature commitment bias caused by early-stage estimation error, and performance degradation caused by uncertain item parameters during online calibration.
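To make the robustness theme concrete, the following is a minimal sketch (not the dissertation's actual implementation) of UCB-style item selection under a two-parameter logistic (2PL) IRT model: each item's estimated Fisher information at the current ability estimate is augmented with an exploration bonus that shrinks as the item accumulates administrations, so sparsely calibrated items are not ignored purely because their parameter estimates are uncertain. The function names, the bonus form `c * sqrt(log(t+1) / (n_i + 1))`, and the constant `c` are illustrative assumptions.

```python
import math

def prob_2pl(theta, a, b):
    """2PL probability of a correct response: 1 / (1 + exp(-a(theta - b)))."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def fisher_info(theta, a, b):
    """Item Fisher information at ability theta under the 2PL model: a^2 * p * (1 - p)."""
    p = prob_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

def ucb_select(theta_hat, items, counts, t, c=1.0):
    """Select the item maximizing estimated information plus a UCB-style
    exploration bonus; `counts[i]` is how often item i has been administered,
    `t` is the current step. The bonus decays as an item is better calibrated."""
    best_idx, best_score = None, -math.inf
    for i, (a, b) in enumerate(items):
        bonus = c * math.sqrt(math.log(t + 1) / (counts[i] + 1))
        score = fisher_info(theta_hat, a, b) + bonus
        if score > best_score:
            best_idx, best_score = i, score
    return best_idx
```

With equal administration counts the bonus terms cancel, and the rule reduces to classical maximum-information selection; the bonus only changes the ranking when some items are markedly less calibrated than others.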

