Email: steve.hanneke@gmail.com
Mobile Phone: (412) 973-3007
Location: Chicago, IL USA
TTIC Office: 407
I am a Research Assistant Professor at the Toyota Technological Institute at Chicago (TTIC).
I work on topics in statistical learning theory.
Research Interests:
My general research interest is in systems that can improve their performance with experience, a
topic known as machine learning. My focus is on the statistical analysis of machine learning. The essential questions I am interested
in answering are "what can be learned from empirical observation / experimentation," and "how much observation / experimentation is necessary and sufficient to
learn it?"
This overall topic intersects with several academic disciplines,
including statistical learning theory, artificial intelligence, statistical inference, algorithmic and statistical
information theories, probability theory, philosophy of science, and epistemology.
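As one concrete instance of the second question (stated here for orientation; this is the main result of "The Optimal Sample Complexity of PAC Learning," listed below): for learning a concept class of VC dimension d in the realizable PAC model, the number of labeled examples that is both necessary and sufficient to guarantee error at most \varepsilon with probability at least 1 - \delta is

    m(\varepsilon, \delta) = \Theta\!\left( \frac{d + \log(1/\delta)}{\varepsilon} \right).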
Brief Bio:
Prior to joining TTIC, I was an independent scientist working in Princeton, NJ from 2012 to 2018,
aside from a one-semester stint as a Visiting Lecturer at Princeton University in 2018.
Before that, from 2009 to 2012,
I was a Visiting Assistant Professor in the
Department of Statistics at
Carnegie Mellon University,
also affiliated with the Machine Learning Department.
I received my PhD in 2009 from the
Machine Learning Department at
Carnegie Mellon University,
co-advised by Eric Xing
and Larry Wasserman.
My thesis work was on the theoretical foundations of active learning.
From 2002 to 2005, I was an undergraduate studying
Computer Science
at the University of Illinois at Urbana-Champaign
(UIUC),
where I worked on semi-supervised learning with Prof.
Dan Roth
and the students in the Cognitive Computation Group.
Prior to that, I studied Computer Science at Webster University
in St. Louis, MO,
where I played around with neural networks
and classic AI a bit.
Recent News and Activities:
- Presenting (with Rob Nowak) an ICML 2019 Tutorial on Active Learning: From Theory to Practice. [video][slides]
- Organizing the ALT 2019 workshop: When Smaller Sample Sizes Suffice for Learning.
- On the organizing committee of the 2019 Midwest Machine Learning Symposium.
- Speaking at the 2019 DALI conference in South Africa.
- Fall 2018: I joined the Toyota Technological Institute at Chicago (TTIC) as a Research Assistant Professor.
- Spring 2018: I taught ORF 525 "Statistical Learning and Nonparametric Estimation" at Princeton University.
- My ICML 2007 paper "A Bound on the Label Complexity of Agnostic Active Learning" received Honorable Mention for the ICML 2017 Test of Time Award.
- Program Committee Chair (with Lev Reyzin) for the 28th International Conference on Algorithmic Learning Theory (ALT 2017), held October 15-17 in Kyoto, Japan. See our published proceedings.
- New manuscript "Learning Whenever Learning is Possible: Universal Learning under General Stochastic Processes" posted to the arXiv.
- Presentation at the Simons Institute Workshop on Interactive Learning. See the video.
- Presentation at the Lorentz Center Workshop on Theoretical Foundations for Learning from Easy Data.
Teaching:
Spring 2018: ORF 525, Statistical Learning and Nonparametric Estimation.
Spring 2012: 36-752, Advanced Probability Overview.
Fall 2011: 36-755, Advanced Statistical Theory I.
Spring 2011: 36-752, Advanced Probability Overview.
Fall 2010 Mini 1: 36-781, Advanced Statistical Methods I: Active Learning.
Fall 2010 Mini 2: 36-782, Advanced Statistical Methods II: Advanced Topics in Machine Learning Theory.
Spring 2010: 36-754, Advanced Probability II: Stochastic Processes.
Fall 2009: 36-752, Advanced Probability Overview.
At ALT 2010 and the 2010 Machine Learning Summer School in Canberra, Australia,
I gave a tutorial on the theory of active learning. [slides]
A Survey of Theoretical Active Learning:
Theory of Active Learning.
[pdf][ps]
This is a survey of some of the recent advances in the
theory of active learning, with particular emphasis on label complexity
guarantees for disagreement-based methods.
I will be updating and expanding this survey every few years
as this area continues to develop;
the current version (v1.1) was updated on September 22, 2014.
A few relevant recent active learning papers not yet discussed in the survey:
[ZC14],
[WHE-Y15],
[HY15].
An abbreviated version of this survey appeared in the
Foundations
and Trends in Machine Learning series,
Volume 7, Issues 2-3, 2014.
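To give a flavor of the disagreement-based methods the survey emphasizes, here is a minimal illustrative sketch of a CAL-style active learner (written for exposition here, not taken from the survey; the threshold class and data stream below are hypothetical stand-ins): maintain the version space of hypotheses consistent with all labels queried so far, request a label only for points on which the surviving hypotheses disagree, and infer the label for free everywhere else.

    def cal_active_learner(hypotheses, unlabeled_stream, query_label):
        # Version space: hypotheses consistent with every label queried so far.
        version_space = list(hypotheses)
        num_queries = 0
        for x in unlabeled_stream:
            predictions = {h(x) for h in version_space}
            if len(predictions) > 1:
                # x lies in the region of disagreement: spend a label query.
                y = query_label(x)
                num_queries += 1
                version_space = [h for h in version_space if h(x) == y]
            # Otherwise all surviving hypotheses agree on x, so its label
            # is inferred for free and the version space is unchanged.
        return version_space, num_queries

    # Hypothetical usage: threshold classifiers on [0, 1].
    thresholds = [i / 100 for i in range(101)]
    hypotheses = [lambda x, t=t: int(x >= t) for t in thresholds]
    stream = [i / 200 for i in range(201)]
    version_space, n = cal_active_learner(
        hypotheses, stream, query_label=lambda x: int(x >= 0.37))
    print(f"queried {n} of {len(stream)} labels; "
          f"{len(version_space)} hypotheses remain")

The label complexity guarantees discussed in the survey bound how many times the query branch above executes, typically in terms of the disagreement coefficient of the hypothesis class.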
Selected Recent Works:
Montasser, O., Hanneke, S., and Srebro, N. (2019). VC Classes are Adversarially Robustly Learnable, but Only Improperly. In Proceedings of the 32nd Annual Conference on Learning Theory (COLT).
Hanneke, S. (2017). Learning Whenever Learning is Possible: Universal Learning under General Stochastic Processes. Under Review.
Hanneke, S. (2016). The Optimal Sample Complexity of PAC Learning. Journal of Machine Learning Research, Vol. 17 (38), pp. 1-15.
Hanneke, S. and Yang, L. (2015). Minimax Analysis of Active Learning. Journal of Machine Learning Research, Vol. 16 (12), pp. 3487-3602.
Hanneke, S. (2012). Activized Learning: Transforming Passive to Active with Improved Label Complexity. Journal of Machine Learning Research, Vol. 13 (5), pp. 1469-1587.
Articles in Preparation:
Nonparametric Active Learning, Part 1: Smooth Regression Functions. [pdf][ps].
Nonparametric Active Learning, Part 2: Smooth Decision Boundaries.
Learning Whenever Learning is Possible: Universal Learning under General Stochastic Processes. [pdf][ps][arXiv].
Active Learning with Identifiable Mixture Models. Joint work with Vittorio Castelli and Liu Yang.
Universal Bayes Consistency in Metric Spaces. Joint work with Aryeh Kontorovich, Sivan Sabato, and Roi Weiss. [pdf][arXiv].
All Publications: (authors are listed in alphabetical order, except sometimes a student author is listed first).
2019
Hanneke, S. and Yang, L. (2019). Surrogate Losses in Passive and Active Learning. Electronic Journal of Statistics, Vol. 13 (2), pp. 4646-4708. [pdf][ps][journal page][arXiv].
Hanneke, S. and Kpotufe, S. (2019). On the Value of Target Data in Transfer Learning. In Proceedings of the 33rd Conference on Neural Information Processing Systems (NeurIPS). [pdf][official page].
Hanneke, S. and Kontorovich, A. (2019). Optimality of SVM: Novel Proofs and Tighter Bounds. Theoretical Computer Science, Vol. 796, pp. 99-113. [pdf][journal page]
Montasser, O., Hanneke, S., and Srebro, N. (2019).
VC Classes are Adversarially Robustly Learnable, but Only Improperly.
In Proceedings of the 32nd Annual Conference on Learning Theory (COLT).
[pdf][official page][arXiv]
Winner of a Best Student Paper Award.
Hanneke, S. and Kontorovich, A. (2019). A Sharp Lower Bound for Agnostic Learning with Sample Compression Schemes. In Proceedings of the 30th International Conference on Algorithmic Learning Theory (ALT). [pdf][arXiv]
Hanneke, S., Kontorovich, A., and Sadigurschi, M. (2019). Sample Compression for Real-Valued Learners. In Proceedings of the 30th International Conference on Algorithmic Learning Theory (ALT). [pdf][arXiv]
Hanneke, S. and Yang, L. (2019). Statistical Learning under Nonstationary Mixing Processes. In Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf][arXiv]
2018
Hanneke, S. and Yang, L. (2018). Testing Piecewise Functions. Theoretical Computer Science, Vol. 745, pp. 23-35. [pdf][ps][journal page][arXiv]
Zhivotovskiy, N. and Hanneke, S. (2018). Localization of VC Classes: Beyond Local Rademacher Complexities.
Theoretical Computer Science, Vol. 742, pp. 27-49.
[pdf][ps][journal page][arXiv]
(Special Issue for ALT 2016; Invited)
Hanneke, S., Kalai, A., Kamath, G., and Tzamos, C. (2018). Actively Avoiding Nonsense in Generative Models. In Proceedings of the 31st Annual Conference on Learning Theory (COLT). [pdf][official page][arXiv]
Yang, L., Hanneke, S., and Carbonell, J. (2018).
Bounds on the Minimax Rate for Estimating a Prior over a VC Class from Independent Learning Tasks.
Theoretical Computer Science, Vol. 716, pp. 124-140.
[pdf][ps][journal page][arXiv]
(Special Issue for ALT 2015; Invited)
2016
Zhivotovskiy, N. and Hanneke, S. (2016). Localization of VC Classes: Beyond Local Rademacher Complexities. In Proceedings of the 27th International Conference on Algorithmic Learning Theory (ALT). [pdf][ps][arXiv]
Hanneke, S. (2016). Refined Error Bounds for Several Learning Algorithms. Journal of Machine Learning Research, Vol. 17 (135), pp. 1-55. [pdf][ps][arXiv][journal page]
Hanneke, S. (2016). The Optimal Sample Complexity of PAC Learning. Journal of Machine Learning Research, Vol. 17 (38), pp. 1-15. [pdf][ps][arXiv][journal page]
2015
Hanneke, S. and Yang, L. (2015). Minimax Analysis of Active Learning. Journal of Machine Learning Research, Vol. 16 (12), pp. 3487-3602. [pdf][ps][arXiv][journal page]
Hanneke, S., Kanade, V., and Yang, L. (2015).
Learning with a Drifting Target Concept.
In Proceedings of the 26th International Conference on Algorithmic Learning Theory (ALT).
[pdf][ps][arXiv]
See also this note on a result for the sample complexity of efficient agnostic learning implicit in the above concept drift paper:
[pdf]
Yang, L., Hanneke, S., and Carbonell, J. (2015). Bounds on the Minimax Rate for Estimating a Prior over a VC Class from Independent Learning Tasks. In Proceedings of the 26th International Conference on Algorithmic Learning Theory (ALT). [pdf][ps][arXiv]
Wiener, Y., Hanneke, S., and El-Yaniv, R. (2015). A Compression Technique for Analyzing Disagreement-Based Active Learning. Journal of Machine Learning Research, Vol. 16 (4), pp. 713-745. [pdf][ps][arXiv][journal page]
2014
Hanneke, S. (2014). Theory of Disagreement-Based Active Learning. Foundations and Trends in Machine Learning, Vol. 7 (2-3), pp. 131-309. [official] [Amazon]
There is also an extended version, which I update from time to time.
2013
Yang, L. and Hanneke, S. (2013). Activized Learning with Uniform Classification Noise. In Proceedings of the 30th International Conference on Machine Learning (ICML). [pdf][ps][appendix pdf][appendix ps]
Yang, L., Hanneke, S., and Carbonell, J. (2013). A Theory of Transfer Learning with Applications to Active Learning. Machine Learning, Vol. 90 (2), pp. 161-189. [pdf][ps][journal page]
2012
Balcan, M.-F. and Hanneke, S. (2012). Robust Interactive Learning. In Proceedings of the 25th Annual Conference on Learning Theory (COLT). [pdf][ps][arXiv]
Hanneke, S. (2012). Activized Learning: Transforming Passive to Active with Improved Label Complexity.
Journal of Machine Learning Research,
Vol. 13 (5), pp. 1469-1587.
[pdf][ps][arXiv][journal page]
Related material: extended abstract, Chapter 4 in my thesis,
and various presentations
[slides][video].
2011
Yang, L., Hanneke, S., and Carbonell, J. (2011). Identifiability of Priors from Bounded Sample Sizes with Applications to Transfer Learning. In Proceedings of the 24th Annual Conference on Learning Theory (COLT). [pdf][ps][video]
Yang, L., Hanneke, S., and Carbonell, J. (2011). The Sample Complexity of Self-Verifying Bayesian Active Learning. In Proceedings of the 14th International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf][ps]
Hanneke, S. (2011). Rates of Convergence in Active Learning. The Annals of Statistics, Vol. 39 (1), pp. 333-361. [pdf][ps][journal page]
2010
Yang, L., Hanneke, S., and Carbonell, J. (2010).
Bayesian Active Learning Using Arbitrary Binary Valued Queries.
In Proceedings of the 21st International Conference on Algorithmic Learning Theory (ALT). [pdf][ps]
Also available in information theory jargon. [pdf][ps]
Hanneke, S., Fu, W., and Xing, E.P. (2010). Discrete Temporal Models of Social Networks. Electronic Journal of Statistics, Vol. 4, pp. 585-605. [pdf][journal page]
Hanneke, S. and Yang, L. (2010). Negative Results for Active Learning with Convex Losses. In Proceedings of the 13th International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf][ps]
Balcan, M.-F., Hanneke, S., and Wortman Vaughan, J. (2010). The True Sample Complexity of Active Learning. Machine Learning, Vol. 80 (2-3), pp. 111-139. [pdf][ps][journal page]
(Special Issue for COLT 2008; Invited)
2009
Hanneke, S. (2009). Theoretical Foundations of Active Learning. Doctoral Dissertation. Machine Learning Department. Carnegie Mellon University. [pdf][ps][defense slides]
Hanneke, S. (2009).
Adaptive Rates of Convergence in Active Learning.
In Proceedings of the 22nd Annual Conference on Learning Theory (COLT). [pdf][ps][slides]
Also available in an expanded journal version.
Hanneke, S. and Xing, E.P. (2009). Network Completion and Survey Sampling. In Proceedings of the 12th International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf][ps][slides]
2008
Balcan, M.-F., Hanneke, S., and Wortman, J. (2008).
The True Sample Complexity of Active Learning.
In Proceedings of the 21st Annual Conference on Learning Theory (COLT).
[pdf][ps][slides]
Winner of the Mark Fulk Best Student Paper Award.
Also available in an extended journal version.
2007
Balcan, M.-F., Even-Dar, E., Hanneke, S., Kearns, M., Mansour, Y., and Wortman, J. (2007).
Asymptotic Active Learning.
NIPS Workshop on Principles of Learning Problem Design.
[pdf][ps][spotlight slide]
Also available in an improved conference version and an expanded journal version.
Hanneke, S. and Xing, E.P. (2007).
Network Completion and Survey Sampling.
NIPS Workshop on Statistical Network Models.
See our later conference publication.
Hanneke, S. (2007). Teaching Dimension and the Complexity of Active Learning. In Proceedings of the 20th Annual Conference on Learning Theory (COLT). [pdf][ps][slides]
Hanneke, S. (2007).
A Bound on the Label Complexity of Agnostic Active Learning.
In Proceedings of the 24th International Conference on Machine Learning (ICML).
[pdf][ps][slides]
Honorable Mention for the ICML 2017 Test of Time Award.
Guo, F., Hanneke, S., Fu, W., and Xing, E.P. (2007).
Recovering Temporally Rewiring Networks:
A Model-based Approach.
In Proceedings of the 24th International Conference on Machine Learning (ICML). [pdf]
Also see our related earlier work.
Hanneke, S. (2007).
The Complexity of Interactive Machine Learning.
KDD Project Report (aka Master's Thesis).
Machine Learning Department, Carnegie Mellon University.
[pdf][ps][slides]
Includes some interesting results from a class project on
The Cost Complexity of Interactive Learning, in addition to
my COLT 2007 and ICML 2007 papers.
2006
Hanneke, S. and Xing, E.P. (2006).
Discrete Temporal Models of Social Networks.
In Proceedings of the ICML Workshop on Statistical Network Analysis.
[pdf][ps][slides]
Also available in an extended journal version.
Hanneke, S. (2006). An Analysis of Graph Cut Size for Transductive Learning. In Proceedings of the 23rd International Conference on Machine Learning (ICML). [pdf][ps][slides ppt][slides pdf]