R. Teal Witter
In July 2025, I will join Claremont McKenna College as an Assistant Professor of Mathematics and Computer Science. My recent research focuses on randomized algorithms for problems in explainable AI and generative AI. More broadly, I am interested in leveraging ideas from theoretical computer science and machine learning to design provably efficient algorithms.
In May 2025, I will earn my Ph.D. in Computer Science from New York University, where I am fortunate to be advised by Chris Musco and Lisa Hellerstein. My graduate studies were generously supported by an NSF Graduate Research Fellowship. I earned a B.A. in Mathematics and a B.A. in Computer Science from Middlebury College in 2020.
Teaching
Claremont McKenna MATH 166: Data Mining
Course Instructor (Fall 2025)
Middlebury CSCI 1052: Randomized Algorithms for Data Science
Course Instructor (Winter 2024)
Middlebury CSCI 1051: Deep Learning
Course Instructor (Winter 2023, Winter 2025)
Papers
In the tradition of theoretical computer science, an asterisk (*) indicates that authors are listed in alphabetical order.
SEAL: Semantic Aware Image Watermarking
Kasra Arabi, R. Teal Witter, Chinmay Hegde, Niv Cohen
Preprint
Kernel Banzhaf: A Fast and Robust Estimator for Banzhaf Values
Yurong Liu, R. Teal Witter, Flip Korn, Tarfah Alrashed, Dimitris Paparas, Christopher Musco, Juliana Freire
Preprint
Hidden in the Noise: Two-Stage Robust Watermarking for Images
Kasra Arabi, Benjamin Feuer, R. Teal Witter, Chinmay Hegde, Niv Cohen
International Conference on Learning Representations (ICLR 2025)
Provably Accurate Shapley Value Estimation via Leverage Score Sampling*
Christopher Musco, R. Teal Witter
International Conference on Learning Representations (ICLR 2025)
Spotlight Presentation
FairlyUncertain: A Comprehensive Benchmark of Uncertainty in Algorithmic Fairness*
Lucas Rosenblatt, R. Teal Witter
Preprint
Benchmarking Estimators for Natural Experiments: A Novel Dataset and a Doubly Robust Algorithm
R. Teal Witter, Christopher Musco
Conference on Neural Information Processing Systems (NeurIPS 2024)
Minimizing Cost Rather Than Maximizing Reward in Restless Multi-Armed Bandits
R. Teal Witter, Lisa Hellerstein
Preprint
I Open at the Close: A Deep Reinforcement Learning Evaluation of Open Streets Initiatives
R. Teal Witter, Lucas Rosenblatt
AAAI Conference on Artificial Intelligence (AAAI 2024)
Robust and Space-Efficient Dual Adversary Quantum Query Algorithms*
Michael Czekanski, Shelby Kimmel, R. Teal Witter
European Symposium on Algorithms (ESA 2023)
Counterfactual Fairness Is Basically Demographic Parity
Lucas Rosenblatt, R. Teal Witter
AAAI Conference on Artificial Intelligence (AAAI 2023)
A Local Search Algorithm for the Min-Sum Submodular Cover Problem*
Lisa Hellerstein, Thomas Lidbetter, R. Teal Witter
International Symposium on Algorithms and Computation (ISAAC 2022)
Adaptivity Gaps for the Stochastic Boolean Function Evaluation Problem*
Lisa Hellerstein, Devorah Kletenik, Naifeng Liu, R. Teal Witter
Workshop on Approximation and Online Algorithms (WAOA 2022)
How to Quantify Polarization in Models of Opinion Dynamics*
Christopher Musco, Indu Ramesh, Johan Ugander, R. Teal Witter
International Workshop on Mining and Learning with Graphs (MLG 2022)
Oral Presentation
Backgammon is Hard
R. Teal Witter
International Conference on Combinatorial Optimization and Applications (COCOA 2021)
A Query-Efficient Quantum Algorithm for Maximum Matching on General Graphs*
Shelby Kimmel, R. Teal Witter
Algorithms and Data Structures Symposium (WADS 2021)
Applications of Graph Theory and Probability in the Board Game Ticket to Ride
R. Teal Witter, Alex Lyford
International Conference on the Foundations of Digital Games (FDG 2020)
Applications of the Quantum Algorithm for st-Connectivity*
Kai DeLorenzo, Shelby Kimmel, R. Teal Witter
Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2019)
More Writing
I wrote lecture notes to accompany Chris Musco’s graduate Algorithmic Machine Learning and Data Science class. I used a subset of these notes for my own Randomized Algorithms for Data Science class.
I developed code-based tutorials on adversarial image attacks, neural style transfer, variational autoencoders, and diffusion models for Chris Musco’s graduate machine learning class.
I wrote notes on contrastive learning, stable diffusion, and implicit regularization for my deep learning class.
I curated code-based demos that accompany Chinmay Hegde’s graduate deep learning class and my own undergraduate deep learning class. Recordings of the demos are available here.
After struggling for years, I compiled a how-to guide for NYU’s high-performance computing cluster.