Welcome to Rishabh Iyer's webpage

Assistant Professor, The University of Texas at Dallas

I am currently an Assistant Professor at The University of Texas at Dallas, where I lead the Machine Learning and Optimization Lab. I am also a Visiting Assistant Professor at the Indian Institute of Technology, Bombay. Before this, I was a Senior Research Scientist at Microsoft from 2016 to 2019. Below are some of the areas I am currently working on:

  • Efficient Learning, data subset selection, and coresets: obtaining 5x - 10x speedups and energy savings by training on small data subsets, with negligible loss in accuracy (generalization performance)

  • Active Learning: learning with fewer labels, reducing labeling costs by 2x - 5x (see this blog page on some of our work in active learning)

  • Robust Learning in the presence of outliers, noise, etc.

  • Data Summarization: Video/Image/Text (summarize massive datasets with scalable discrete optimization)

  • Model Compression/Pruning, Feature Selection, Cost-sensitive Feature Selection (reduce model size for deployment in resource-constrained environments)

  • Learning with Rules, Labeling Functions, and Data Programming

  • Discrete Optimization (specifically submodular optimization)

  • Combinatorial (Submodular) Information Measures
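
Several of the areas above (subset selection, summarization, active learning) rest on maximizing a monotone submodular function under a cardinality budget, typically with the classic greedy algorithm. The sketch below is purely illustrative, assuming a toy facility-location objective and a made-up similarity matrix; it is not code from any specific paper:

```python
def facility_location(subset, sim):
    """f(S) = sum_i max_{j in S} sim[i][j]; f(empty set) = 0.
    A monotone submodular 'representation' objective."""
    if not subset:
        return 0.0
    return sum(max(row[j] for j in subset) for row in sim)

def greedy_select(sim, k):
    """Classic greedy: repeatedly add the element with the largest
    marginal gain. For monotone submodular f under a cardinality
    constraint, this gives a (1 - 1/e) approximation."""
    selected = []
    for _ in range(k):
        base = facility_location(selected, sim)
        gains = {
            j: facility_location(selected + [j], sim) - base
            for j in range(len(sim)) if j not in selected
        }
        selected.append(max(gains, key=gains.get))
    return selected

# Toy similarity matrix over 4 data points: two tight clusters
# ({0, 1} and {2, 3}); values are made up for illustration.
sim = [
    [1.0, 0.9, 0.1, 0.0],
    [0.9, 1.0, 0.2, 0.1],
    [0.1, 0.2, 1.0, 0.8],
    [0.0, 0.1, 0.8, 1.0],
]
chosen = greedy_select(sim, 2)  # picks one representative per cluster
```

This naive version re-evaluates f from scratch at every step; the scalable variants used in practice (lazy/stochastic greedy, memoized gains) follow the same outline.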

I also have a strong theoretical background in discrete and continuous optimization, as well as extensive experience getting real-world machine learning, computer vision, text, and NLP problems to work! For more on my research, please see my research page or my publications.

I completed my Ph.D. in 2015 at the University of Washington, Seattle, where I worked with Jeff Bilmes. I am excited about making machines assist humans in processing massive amounts of data, particularly in understanding videos and images. I am interested in building intelligent systems that organize, analyze, and summarize massive amounts of data, and automatically learn from it.

I received best paper awards at Neural Information Processing Systems (NeurIPS/NIPS) in 2013 and the International Conference on Machine Learning (ICML) in 2013, and an Honorable Mention at CODS-COMAD in 2021. I also won a Microsoft Research Ph.D. Fellowship, a Facebook Ph.D. Fellowship, and the Yang Award for Outstanding Graduate Student from the University of Washington.

For more information, please see my Google Scholar Profile, LinkedIn Profile, DBLP, or my GitHub page. I also maintain a YouTube channel where I add videos of my lectures and research talks.

Twitter: @rishiyer

Awards and Recognition

  • Honorable Mention for our paper at CODS-COMAD 2021

  • Finalist in the LDV Computer Vision Conference, New York in 2017

  • Yang Outstanding Graduate Student Award, University of Washington, Seattle

  • Microsoft Research Fellowship Award, 2014

  • Facebook Fellowship Award, 2014 (declined in favor of the Microsoft Research Fellowship)

  • Best Paper Award at the International Conference on Machine Learning, 2013

  • Best Paper Award at the Neural Information Processing Systems Conference, 2013

Work Experience and Education

  • Spring 2020 to Present, Assistant Professor at the CS Department, UT Dallas

  • August 2020 to Present, Visiting Assistant Professor at CSE Department, IIT Bombay

  • March 2016 - December 2019, Senior Research Scientist, Microsoft

  • March 2015 - March 2016, Post-Doctoral Researcher, University of Washington

  • September 2011 - March 2015, M.S. and Ph.D., University of Washington, Seattle

  • B.Tech, IIT Bombay (graduated May 2011)


Recent news

  • Our paper on Concave Aspects of Submodular Functions accepted at ISIT 2020.

  • Invited Speaker and Participant at the Workshop on Optimization in Machine Learning at IST Austria, May 2020

  • Invited Speaker at the Information Theory and Applications (ITA) Workshop in San Diego, CA, in February 2020

  • Our paper on Robust Submodular Minimization accepted at ECAI 2020!

  • I'm teaching (and designing) a new course at UT Dallas in Spring 2020 on Optimization in Machine Learning (Course Website). The course covers the basics of both continuous and discrete optimization in ML, mixing theory with the practical (implementation) aspects of each.

  • I've joined the CS Department of The University of Texas at Dallas in Spring 2020 as an Assistant Professor.

  • September 2019: Visited Tata Institute of Fundamental Research for an Invited Talk

  • February 2019: Visited the University of Texas at Dallas and the University of Pittsburgh and gave a talk on Scalable and Practical Discrete Optimization for Big Data (see this link).

  • December 2018: Two papers accepted into AISTATS 2019!

  • October 2018: Tutorial Speaker at the IEEE Winter Conference on Applications of Computer Vision (WACV) 2019 (see the tutorial website; slides are posted there)

  • October 2018: Three papers accepted to WACV 2019!

  • October 2018: Video Analytics software developed with collaborators at IIT Bombay available now at this link.

  • October 2018: Invited talks at the Allen Institute for AI and Google Seattle (Video Link)

  • July 2018: Released Open Source software Jensen with my collaborators John Halloran and Kai Wei

  • May 2017: Presented our work on Online Learning for Click Prediction at the Microsoft Machine Learning, AI and Data Science Conference

  • May 2017: Finalist at the LDV Vision Conference, New York

  • March 2017: Invited Speaker at AMS Sectional Meeting, Special Session on Geometry and Optimization in Computer Vision, Pullman, WA

  • March 2017: Our work on Limited Vocabulary Speech Data Subset Selection accepted to appear in Computer Speech & Language, 2017. Corpus definitions and baselines for the SVitchboard-II and FiSVer-I datasets can be found at this link.

  • April 2016: Work on Minimizing the Ratio of Submodular Functions accepted at ICML 2016

  • February 2016: Finished my postdoc; joining Microsoft starting March 2016.

  • Two papers accepted at NIPS 2015, two at AISTATS 2015, one each at ACL 2015 and INTERSPEECH 2015, and one at ICML 2015

  • Invited Speaker at the International Symposium on Mathematical Programming (ISMP), Pittsburgh, July 2015 (Session on Submodular Optimization, Link)

  • Invited Lecturer at the Non-convex Optimization for Machine Learning (NOML) Summer School, IIT Bombay, India, June 2015

  • Successfully defended my Ph.D. in March 2015!

Selected Publications

For the complete list of publications and workshop papers, see my publications page or my research page.