Welcome to Rishabh Iyer's webpage
Assistant Professor, University of Texas Dallas
I am currently an Assistant Professor at the University of Texas at Dallas, where I lead the CARAML Lab. I am also a Visiting Assistant Professor at the Indian Institute of Technology Bombay. Before this, I was a Senior Research Scientist at Microsoft from 2016 to 2019. Below are some of the areas my group is currently working on:
Efficient Learning, data subset selection, and coresets: obtaining 5x - 10x speedups and energy savings by training on small data subsets, with negligible loss in accuracy (generalization performance)
Active Learning: learning with fewer labels, reducing labeling costs by 2x - 5x (see this blog page on some of our work in active learning)
Robust Learning in the presence of outliers, noise, etc.
Data Summarization: Video/Image/Text (summarize massive datasets with scalable discrete optimization)
Model Compression/Pruning, Feature Selection, Cost-sensitive Feature Selection (reduce model size for deployment in resource-constrained environments)
Learning with Rules, Labeling Functions, and Data Programming
Discrete Optimization (specifically submodular optimization)
Combinatorial (Submodular) Information Measures
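Much of the list above rests on greedy submodular maximization, so a minimal sketch may make it concrete. The following is an illustrative implementation of the classic greedy algorithm on a facility-location objective (a standard monotone submodular function for data summarization); all function and variable names here are hypothetical and not from any of our toolkits.

```python
import numpy as np

def marginal_gain(sim, candidate, current_max):
    """Marginal gain of adding `candidate` under the facility-location
    objective f(S) = sum_i max_{j in S} sim[i, j]."""
    new_max = np.maximum(current_max, sim[:, candidate])
    return new_max.sum() - current_max.sum(), new_max

def greedy_facility_location(sim, budget):
    """Naive greedy maximization of the facility-location function.
    For monotone submodular objectives, greedy gives a (1 - 1/e)
    approximation (Nemhauser, Wolsey, and Fisher, 1978)."""
    n = sim.shape[0]
    selected = []
    current_max = np.zeros(n)  # each point's best similarity to the summary
    for _ in range(budget):
        best = (-np.inf, None, None)
        for j in range(n):
            if j in selected:
                continue
            gain, new_max = marginal_gain(sim, j, current_max)
            if gain > best[0]:
                best = (gain, j, new_max)
        selected.append(best[1])
        current_max = best[2]
    return selected

# Toy example: five 1-D points forming three clusters, RBF similarity.
X = np.array([0.0, 0.1, 5.0, 5.1, 10.0])
sim = np.exp(-np.abs(X[:, None] - X[None, :]))
subset = greedy_facility_location(sim, budget=3)
# Greedy picks one representative from each of the three clusters.
```

In practice, our group's work uses optimizations such as lazy evaluation and memoization to scale this basic scheme to large datasets.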
Our research is currently supported by grants from NSF, Adobe, Google, and a UT Dallas seed grant. Our work is motivated by real-world problems in machine learning, computer vision, and natural language processing. For more on my research, please see my research page or my publications.
I completed my Ph.D. in 2015 at the University of Washington, Seattle, where I worked with Jeff Bilmes. I am excited about making machines assist humans in processing massive amounts of data, particularly videos and images. I am interested in building intelligent systems that organize, analyze, and summarize massive amounts of data, and that automatically learn from it.
I received best paper awards at the Neural Information Processing Systems (NeurIPS/NIPS) conference in 2013 and the International Conference on Machine Learning (ICML) in 2013, and an Honorable Mention at CODS-COMAD in 2021. I also won a Microsoft Research Ph.D. Fellowship, a Facebook Ph.D. Fellowship, and the Yang Award for Outstanding Graduate Student from the University of Washington.
Awards and Recognition
Honorable Mention for our paper at CODS-COMAD 2021
Outstanding Reviewer Award for NeurIPS 2020 and 2021!
Finalist in the LDV Computer Vision Conference, New York in 2017
Yang Outstanding Graduate Student Award, University of Washington, Seattle
Microsoft Research Fellowship Award, 2014
Facebook Fellowship Award, 2014 (declined in favor of the Microsoft fellowship)
Best Paper Award at the International Conference on Machine Learning, 2013
Best Paper Award at the Neural Information Processing Systems Conference, 2013
Work Experience and Education
Spring 2020 to Present, Assistant Professor at the CS Department, UT Dallas
August 2020 to Present, Visiting Assistant Professor at CSE Department, IIT Bombay
March 2016 - December 2019, Senior Research Scientist, Microsoft
March 2015 - March 2016, Post-Doctoral Researcher, University of Washington
September 2011 - March 2015, M.S. and Ph.D., University of Washington, Seattle
May 2011, B.Tech, IIT Bombay
University of Texas at Dallas
University of Washington
Spring 2014: Teaching Assistant for Submodular Functions, Optimization, and Applications to Machine Learning
Fall 2011: Introduction to Electrical Engineering
Indian Institute of Technology Bombay
Tutorials/Workshops at Conferences
Our work on Submodular Information Measures was accepted to the IEEE Transactions on Information Theory.
Outstanding Reviewer Award from NeurIPS 2021! (received the same award in 2020 as well).
Excited to be giving a tutorial at AAAI 2022 on Subset Selection in Machine Learning: Theory, Applications, and Hands-on. Stay tuned for more updates!
Three papers from CARAML lab are accepted at NeurIPS 2021! Congrats Krishna, Ping, Nathan, and Suraj!
Received an NSF Collaborative Medium Award on Submodular Information Functions with Applications to Machine Learning. Thanks, NSF!
Received gift funding from Adobe for Targeted Subset Selection! Thanks, Adobe!
Received gift funding from Google on Continuous Learning! Thanks, Google!
Together with Abir De, Ganesh Ramakrishnan, and Jeff Bilmes, I am co-organizing a workshop on Subset Selection in Machine Learning: From Theory to Applications at ICML 2021 on July 24th 2021! Workshop page: https://icml.cc/virtual/2021/workshop/8351
Will serve as an Area chair for AAAI 2022
Excited to release submodlib (Github: https://github.com/decile-team/submodlib), a submodular optimization toolkit. Credits to Vishal Kaushal for leading this effort.
Excited to release CORDS (Github: https://github.com/decile-team/cords), a PyTorch-based open-source library for efficient deep model training and AutoML! Credits to my student Krishnateja Killamsetty for leading this.
Excited to release DISTIL (Github: https://github.com/decile-team/distil), a PyTorch-based open-source active learning toolkit for deep learning! Credits to my students Nathan Beck and Durga Sivasubramanian for leading this.
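Toolkits in this space typically wrap query strategies such as uncertainty sampling; the following is a generic, self-contained sketch of that strategy, not DISTIL's actual API (all names here are illustrative).

```python
import numpy as np

def uncertainty_sample(probs, budget):
    """Select the `budget` unlabeled points whose predictions are least
    confident (smallest maximum class probability)."""
    confidence = probs.max(axis=1)
    return np.argsort(confidence)[:budget]

# Toy example: model posteriors over 3 classes for 4 unlabeled points.
probs = np.array([
    [0.98, 0.01, 0.01],  # confident -> skip
    [0.34, 0.33, 0.33],  # most uncertain -> query
    [0.70, 0.20, 0.10],
    [0.50, 0.45, 0.05],  # fairly uncertain -> query
])
query = uncertainty_sample(probs, budget=2)  # indices of points to label
```

In an active learning loop, the selected indices are sent to an annotator, the model is retrained on the enlarged labeled set, and the process repeats until the labeling budget is exhausted.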
Two papers (GRAD-MATCH and SELCON) accepted to ICML 2021!
Two papers on rule augmented learning accepted at Findings of ACL 2021 (one short and one long).
Happy to announce that we have released VISIOCITY, a dataset comprising long videos for video summarization and, more broadly, video understanding!
Our work on "A Clustering based Selection Framework for Cost Aware and Test-time Feature Elicitation" received Best Paper Honorable Mention at CODS-COMAD 2021! Congrats Srijita and Sriraam!
I will be presenting a tutorial on Combinatorial Approaches for Data, Topic and Feature Selection and Summarization at IJCAI 2020 with Ganesh Ramakrishnan (presented a similar one at ECAI 2020 earlier this year).
Our paper on Data Subset Selection (GLISTER) accepted to AAAI 2021!
Invited Talk in the Special Session Deep Learning and Information Theory at SPCOM 2020 (Virtual)
Senior Program Committee for AAAI 2021
Selected among the top 10% of reviewers for NeurIPS 2020
Our paper on Concave Aspects of Submodular Functions accepted at ISIT 2020.
Invited Speaker and Participant at the Workshop on Optimization in Machine Learning at IST Austria, May 2020
Invited Speaker at the Information Theory and Applications (ITA) workshop at San Diego, CA in February 2020
Our paper on Robust Submodular Minimization accepted at ECAI 2020!
I'm designing and teaching a new course at UT Dallas in Spring 2020 on Optimization in Machine Learning (Course Website). The course covers the basics of both continuous and discrete optimization in ML, mixing theory with practical (implementation) aspects.
September 2019: Visited Tata Institute of Fundamental Research for an Invited Talk
February 2019: Visited the University of Texas at Dallas and the University of Pittsburgh and gave a talk on Scalable and Practical Discrete Optimization for Big Data (see this link).
December 2018: Two papers accepted into AISTATS 2019!
October 2018: Tutorial Speaker at the 7th IEEE Winter Conference on Applications of Computer Vision (WACV) 2019 (see the tutorial website; slides are posted there)
October 2018: Three papers accepted to WACV 2019!
October 2018: Video Analytics software developed with collaborators at IIT Bombay available now at this link.
October 2018: Invited talks at the Allen Institute for AI and Google Seattle (Video Link)
July 2018: Released Open Source software Jensen with my collaborators John Halloran and Kai Wei
May 2017: Presented our work on Online Learning for Click Prediction at the Microsoft Machine Learning, AI and Data Science Conference
May 2017: Finalist at the LDV Vision Conference, New York
March 2017: Invited Speaker at AMS Sectional Meeting, Special Session on Geometry and Optimization in Computer Vision, Pullman, WA
March 2017: Our work on Limited Vocabulary Speech Data Subset Selection accepted to appear in Computer Speech & Language, 2017. Corpus definitions and baselines for the SVitchboard-II and FiSVer-I datasets can be found at this link.
April 2016: Work on Minimizing Ratio of Submodular Function accepted at ICML 2016
Feb 2016: Finished my postdoc; joining Microsoft starting March 2016.
Two papers accepted at NIPS 2015, two at AISTATS 2015, and one each at ACL 2015, INTERSPEECH 2015, and ICML 2015
Invited Speaker at the International Symposium on Mathematical Programming (ISMP), Pittsburgh - July 2015 (Session on Submodular Optimization, Link)
Invited Lecturer at the Non-convex Optimization for Machine Learning (NOML) Summer School, IIT Bombay, India, June 2015
Successfully defended my Ph.D. in March 2015!
Krishnateja Killamsetty, Xujiang Zhao, Feng Chen, and Rishabh Iyer, RETRIEVE: Coreset Selection for Efficient and Robust Semi-Supervised Learning, To appear in Advances in Neural Information Processing Systems (NeurIPS) 2021
Krishnateja Killamsetty, Durga Sivasubramanian, Ganesh Ramakrishnan, Abir De, Rishabh Iyer, GRAD-MATCH: A Gradient Matching Based Data Subset Selection for Efficient Deep Model Training, To appear in Proc. International Conference on Machine Learning (ICML) 2021
Durga Sivasubramanian, Rishabh Iyer, Ganesh Ramakrishnan, and Abir De, Training Data Subset Selection for Regression with Controlled Validation Error, To appear in Proc. International Conference on Machine Learning (ICML) 2021
Krishnateja Killamsetty, S Durga, Ganesh Ramakrishnan, and Rishabh Iyer, GLISTER: Generalization based Data Subset Selection for Efficient and Robust Learning, 35th AAAI Conference on Artificial Intelligence, AAAI 2021 (21% Acceptance Rate)
Ayush Maheshwari, Oishik Chatterjee, Krishnateja Killamsetty, Ganesh Ramakrishnan, and Rishabh Iyer, Data Programming using Semi-Supervision and Subset Selection, To appear in Findings of ACL, 2021 (Long Paper)
Rishabh Iyer, Ninad Khargonkar, Jeff Bilmes, and Himanshu Asnani, Submodular Combinatorial Information Measures with Applications in Machine Learning, The 32nd International Conference on Algorithmic Learning Theory, ALT 2021
Srijita Das, Rishabh Iyer, Sriraam Natarajan, A Clustering based Selection Framework for Cost Aware and Test-time Feature Elicitation, In CODS-COMAD 2021 (Best Paper Honorable Mention, Research Track)
Rishabh Iyer and Jeff Bilmes, A Memoization Framework for Scaling Submodular Optimization to Large Scale Problems, Artificial Intelligence and Statistics (AISTATS) 2019, Naha, Okinawa, Japan
Vishal Kaushal, Rishabh Iyer, Suraj Kothiwade, Rohan Mahadev, Khoshrav Doctor, and Ganesh Ramakrishnan, Learning From Less Data: A Unified Data Subset Selection and Active Learning Framework for Computer Vision, 7th IEEE Winter Conference on Applications of Computer Vision (WACV), 2019 Hawaii, USA (Link to the Video)
Yuzong Liu, Rishabh Iyer, Katrin Kirchhoff, Jeff Bilmes, SVitchboard-II and FiSVer-I: Crafting high quality and low complexity conversational english speech corpora using submodular function optimization, Computer Speech & Language 42, 122-142, 2017 (Corpus Definitions and Baselines for SVitchboard-II and FiSVer-I datasets can be found at this link)
Wenruo Bai, Rishabh Iyer, Kai Wei, Jeff Bilmes, Algorithms for optimizing the ratio of submodular functions, In Proc. International Conference on Machine Learning ( ICML) 2016
Kai Wei, Rishabh Iyer, Shenjie Wang, Wenruo Bai, Jeff Bilmes, Mixed robust/average submodular partitioning: Fast algorithms, guarantees, and applications, In Advances in Neural Information Processing Systems (NIPS) 2015
Kai Wei, Rishabh Iyer, Jeff Bilmes, Submodularity in data subset selection and active learning, International Conference on Machine Learning (ICML) 2015
Sebastian Tschiatschek, Rishabh K Iyer, Haochen Wei, Jeff A Bilmes, Learning mixtures of submodular functions for image collection summarization, In Advances in Neural Information Processing Systems (NIPS) 2014
Rishabh Iyer and Jeff Bilmes, Submodular optimization with submodular cover and submodular knapsack constraints, In Advances in Neural Information Processing Systems (NIPS) 2013 (Winner of the Outstanding Paper Award; link to video, from the 56th minute)
Rishabh Iyer, Stefanie Jegelka, Jeff Bilmes, Fast semidifferential-based submodular function optimization, International Conference on Machine Learning (ICML) 2013 (Winner of the Best Paper Award)
Rishabh Iyer, Jeff A Bilmes, The Lovász-Bregman Divergence and connections to rank aggregation, clustering, and web ranking, Uncertainty In Artificial Intelligence (UAI) 2013
Rishabh Iyer, Jeff Bilmes, Algorithms for approximate minimization of the difference between submodular functions, with applications, Uncertainty in Artificial Intelligence (UAI) 2012
Funding and Support
Our research is generously supported by research grants from NSF, Google, Adobe, and a UT Dallas startup fund.