My research explores efficiency problems in machine learning systems, focusing on how intelligent algorithmic design
can reduce the resource requirements that limit broader participation in AI development and deployment.
I'm currently investigating several interconnected directions: agreement-based cascading for efficient model
routing, semantic extraction of parallelism from natural-language queries, and forward-pass-only techniques for model compression.
My broader research vision centers on demonstrating that accessibility and performance are not opposing forces: thoughtful design choices can achieve both.
Steven Kolawole*, Don Dennis*, Ameet Talwalkar, Virginia Smith
TMLR 2025
Develops a training-free cascading framework using ensemble agreement as a confidence signal for model routing,
enabling cost reductions while maintaining or improving accuracy across diverse tasks.
Extends agreement-based cascading to open-ended generation tasks, leveraging meaning-level consensus for cost-effective
routing of language model queries without requiring additional training data or model modifications.
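For intuition, here is a minimal sketch of the agreement-as-confidence idea; the `small_models`, `large_model`, and majority-vote threshold below are illustrative placeholders, not the exact formulation from these papers.

```python
from collections import Counter
from typing import Callable, Sequence

def agreement_cascade(query: str,
                      small_models: Sequence[Callable[[str], str]],
                      large_model: Callable[[str], str],
                      threshold: float = 1.0) -> str:
    """Answer with a cheap ensemble when its members agree; otherwise escalate."""
    answers = [m(query) for m in small_models]        # cheap forward passes
    top_answer, votes = Counter(answers).most_common(1)[0]
    if votes / len(answers) >= threshold:             # agreement acts as confidence
        return top_answer                             # stop early and save cost
    return large_model(query)                         # defer to the expensive model
```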
Steven Kolawole, Keshav Santhanam, Virginia Smith, Pratiksha Thaker
NeurIPS 2025 D&B Track
Introduces a benchmark revealing that 10% of natural user queries contain latent parallelism,
and demonstrates semantic decomposition methods for achieving speedups without hardware modifications.
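A rough sketch of the decomposition idea follows; the `decompose`, `llm_call`, and `combine` helpers are hypothetical stand-ins, and the benchmark's actual pipeline is more involved.

```python
import asyncio
from typing import Awaitable, Callable, List

async def answer_in_parallel(query: str,
                             decompose: Callable[[str], List[str]],
                             llm_call: Callable[[str], Awaitable[str]],
                             combine: Callable[[List[str]], str]) -> str:
    """Split a query with latent parallelism into independent sub-queries,
    run them concurrently, and merge the partial answers."""
    sub_queries = decompose(query)          # e.g. one sub-query per listed item
    if len(sub_queries) <= 1:               # nothing to parallelize
        return await llm_call(query)
    parts = await asyncio.gather(*(llm_call(q) for q in sub_queries))
    return combine(list(parts))             # stitch the results back together
```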
Steven Kolawole*, Lucio Dery*, Jean-François Kagy, Virginia Smith, Graham Neubig, Ameet Talwalkar
under review
Presents a forward-pass-only structured pruning method that outperforms gradient-based approaches
while using significantly less memory, making model compression accessible on everyday hardware.
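The gist, sketched below with hypothetical `ablate`/`restore` helpers and a small calibration batch; the paper's actual scoring rule and pruning schedule differ.

```python
import torch

@torch.no_grad()  # forward passes only: no gradients or optimizer state in memory
def score_units(model, units, calib_batch, ablate, restore):
    """Score each structural unit (e.g. an attention head or channel index) by
    how much zeroing it out perturbs the output on a calibration batch."""
    baseline = model(calib_batch)
    scores = {}
    for unit in units:
        saved = ablate(model, unit)                   # temporarily zero the unit
        scores[unit] = (model(calib_batch) - baseline).abs().mean().item()
        restore(model, unit, saved)                   # undo the ablation
    return scores  # prune the lowest-scoring units up to the target budget
```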
My commitment to making AI research more accessible extends beyond algorithmic contributions to building
infrastructure for inclusive participation. Every year, I mentor 20+ underrepresented graduate school
aspirants at
STEM for Development, helping them clarify research goals and optimize their applications for Western graduate programs.
I am also a founding organizer at
ML Collective-Nigeria, a grassroots hub that fosters research outside formal academic structures. We
run peer-led study groups, host research sprints, and leverage MLC's resources to support members through mentorship, collaborative projects, and idea exchange; many now publish at top venues.
I organize annual fundraisers that enable African student researchers to attend the Deep Learning Indaba, and I
occasionally contribute to Black in AI's ELAI program. During my undergraduate
years, I helped organize dozens of technical training programs that reached over 3,000 students and personally taught several
hundred students machine learning, data science, and other technical skills.
Outside research, I enjoy powerlifting, amateur boxing, watching LFC matches, and reading diversely.
I'm always interested in conversations that bridge technical work with broader social impact.
The Four-Turned-Six Years of BSc.
Before joining CMU, I completed my BSc in Computer Science at the
Federal University of Agriculture Abeokuta, Nigeria, advised by
Dr. Adebayo Abayomi-Alli. Academic union strikes and the COVID-19 lockdown extended my bachelor's degree timeline.
Fortunately, this gave me more time than a typical undergraduate to refine my interests
beyond what my immediate environment offered and to amass an eclectic mix of fulfilling experiences.
I had ample time to participate in several hackathons and internships, and I regularly spoke about tools and topics
I liked at tech conferences around the globe. In early 2021, I started learning to be an independent researcher
with ML Collective (and made cameo appearances at Masakhane and Cohere For AI), where I have been fortunate
to be mentored primarily by Dr. Rosanne Liu and Dr. Jason Yosinski.
In the following year, my first completed research project (on sign language understanding) earned me the national AI champion award at the Nigeria Computer Society's AI Summit,
held at Lafia's Government House.
Toward the end of my degree, my friends and I designed a real-time opinion-mining system for digital assets and secured a
115k USD grant to bring it to life. This enabled me to focus on independent research without worrying much about my living expenses.
I am a "community-taught" ML practitioner; hence, much of those years were dedicated to giving back to our tech communities,
including Data Science Nigeria, Google Developer Student Club, She Code Africa, the National Association of Computing Students,
and ML Collective. This unconventional path—learning and building research capabilities outside formal structures—directly
informs my current work on eliminating barriers to AI accessibility.
In the earliest of those years, I had stints as a choir director at my local churches, focusing on vocals, drums, and piano.
My BSc years were roundly punctuated by existential crises [1, 2].
My final year was spent transitioning from award-worthy social awkwardness to an unexpected reputation as a local clown,
all while obsessively fine-tuning my Afro-pop dance skills and embracing the reveler lifestyle.