**Contact Information:**

Email: steve.hanneke@gmail.com

Mobile Phone: (412) 973-3007

Location: Princeton, NJ USA

I am an independent scientist working on topics in statistical learning theory.

**Research Interests:**

My general research interest is in systems that can improve their performance
with experience, a topic known as machine learning. My focus is on the
informational complexity of machine learning. The essential questions I am
interested in answering are "What can be learned from empirical observation
and/or interaction?" and "How much observation and/or interaction is necessary
to learn it?"
This overall topic intersects with several academic disciplines,
including statistical learning theory, statistical inference, algorithmic and statistical
information theories, philosophy of science, and epistemology.

**Brief Bio:**

From 2009 to 2012,
I was a Visiting Assistant Professor in the
Department of Statistics at
Carnegie Mellon University,
also affiliated with the Machine Learning Department.
I have recently been taking time away from formal employment to focus entirely on research.
I received my PhD in 2009 from the
Machine Learning Department at
Carnegie Mellon University,
co-advised by Eric Xing
and Larry Wasserman.
My thesis work was on the theoretical foundations of active learning.
Prior to that, I was an undergraduate in
Computer Science
at the University of Illinois at Urbana-Champaign
(UIUC),
where I worked on semi-supervised learning with Prof.
Dan Roth
and the students in the Cognitive Computation Group.
Before that, I studied Computer Science at Webster University
in St. Louis, MO,
where I played around with neural networks
and classic AI a bit.

**Teaching:**

Spring 2012: 36-752, Advanced Probability Overview.

Fall 2011: 36-755, Advanced Statistical Theory I.

Spring 2011: 36-752, Advanced Probability Overview.

Fall 2010 Mini 1: 36-781, Advanced Statistical Methods I: Active Learning.

Fall 2010 Mini 2: 36-782, Advanced Statistical Methods II: Advanced Topics in Machine Learning Theory.

Spring 2010:
36-754, Advanced Probability II: Stochastic Processes.

Fall 2009: 36-752, Advanced Probability Overview.

At ALT 2010 and the 2010 Machine Learning Summer School in Canberra, Australia,
I gave a tutorial on the theory of active learning. [slides]

**A Survey of Theoretical Active Learning:**

*Theory of Active Learning*.
[pdf][ps]

This is a survey of some of the recent advances in the
theory of active learning, with particular emphasis on label complexity
guarantees for disagreement-based methods.
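
To give a rough sense of what "disagreement-based" means: the learner maintains the set of hypotheses still consistent with all labels observed so far (the version space), and requests a label only for points on which those hypotheses disagree. Below is a minimal illustrative sketch in Python for a hypothetical finite class of one-dimensional threshold classifiers; it is a toy instance of the idea, not code from the survey, and all names in it are made up for illustration.

```python
# Toy sketch of disagreement-based active learning (CAL-style)
# for a finite class of 1-D threshold classifiers.

import random

def predict(threshold, x):
    """Threshold classifier: label +1 iff x >= threshold."""
    return 1 if x >= threshold else -1

def disagreement_based_learner(stream, thresholds, oracle):
    """Scan an unlabeled stream, querying the oracle only on points
    in the region of disagreement of the current version space."""
    version_space = list(thresholds)
    num_queries = 0
    for x in stream:
        labels = {predict(h, x) for h in version_space}
        if len(labels) > 1:   # surviving hypotheses disagree on x,
            y = oracle(x)     # so the label must be requested
            num_queries += 1
            version_space = [h for h in version_space if predict(h, x) == y]
        # otherwise every surviving hypothesis agrees; no query needed
    return version_space, num_queries

# Example run: target threshold 0.5, candidate thresholds on a grid.
random.seed(0)
thresholds = [i / 100 for i in range(101)]
oracle = lambda x: predict(0.5, x)
stream = [random.random() for _ in range(200)]
V, q = disagreement_based_learner(stream, thresholds, oracle)
print(f"queried {q} of {len(stream)} labels; {len(V)} hypotheses remain")
```

For thresholds, the region of disagreement shrinks quickly as labels accumulate, so the number of queries grows far more slowly than the stream length; quantifying this kind of savings in general, via quantities such as the disagreement coefficient, is what the label complexity guarantees discussed in the survey are about.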

Note: I will be updating and expanding this survey
as this area continues to develop;
the current version (v1.1) was updated on September 22, 2014.

An abbreviated version appeared in the
Foundations
and Trends in Machine Learning series,
Volume 7, Issues 2-3, 2014.

**Unpublished Notes:**

I have some "working notes" that may be of interest to some people. These include articles under review and unpublished results, which may eventually become papers.

These notes are subject to frequent updates and changes -- mostly additions -- as I continue to explore these topics.

*The Optimal Sample Complexity of PAC Learning*. [pdf][ps][arXiv]

*Refined Error Bounds for Several Learning Algorithms*. [pdf][ps]

*Surrogate Losses in Passive and Active Learning*. [pdf][ps][arXiv]. Joint work with Liu Yang.

**Publications:**
*(authors are listed in alphabetical order).*

**2015**

Hanneke, S. and Yang, L. (2015).
*Minimax Analysis of Active Learning*.
To appear in the Journal of Machine Learning Research.
[pdf][ps][arXiv]

Hanneke, S., Kanade, V., and Yang, L. (2015).
*Learning with a Drifting Target Concept*.
In Proceedings of the 26^{th} International Conference on Algorithmic Learning Theory (ALT).
[pdf][ps][arXiv]

Carbonell, J., Hanneke, S., and Yang, L. (2015).
*Bounds on the Minimax Rate for Estimating a Prior over a VC Class from Independent Learning Tasks*.
In Proceedings of the 26^{th} International Conference on Algorithmic Learning Theory (ALT).
[pdf][ps][arXiv]

El-Yaniv, R., Hanneke, S., and Wiener, Y. (2015).
*A Compression Technique for Analyzing Disagreement-Based Active Learning*.
Journal of Machine Learning Research, Vol. 16 (4), pp. 713-745.
[pdf][ps][arXiv]

**2014**

Hanneke, S. (2014). *Theory of Disagreement-Based Active Learning*. Foundations and Trends in Machine Learning, Vol. 7 (2-3), pp. 131-309. [official] [Amazon]

There is also an extended version, which I update from time to time.

**2013**

Hanneke, S. and Yang, L. (2013).
*Activized Learning with Uniform Classification Noise*.
In Proceedings of the 30^{th} International Conference on Machine Learning (ICML).
[pdf][ps][appendix pdf][appendix ps]

Carbonell, J., Hanneke, S., and Yang, L. (2013).
*A Theory of Transfer Learning with Applications to Active Learning*. Machine Learning, Vol. 90 (2), pp. 161-189. [pdf][ps][journal page]

**2012**

Balcan, M.-F. and Hanneke, S. (2012).
*Robust Interactive Learning*.
In Proceedings of the 25^{th} Annual Conference on Learning Theory (COLT). [pdf][ps][arXiv]

Hanneke, S. (2012). *Activized Learning: Transforming Passive to Active with Improved Label Complexity.*
Journal of Machine Learning Research,
Vol. 13 (5), pp. 1469-1587.
[pdf][ps][arXiv][journal page]

Related material: extended abstract, Chapter 4 in my thesis,
and various presentations
[slides][video].

**2011**

Carbonell, J., Hanneke, S., and Yang, L. (2011).
*Identifiability of Priors from Bounded Sample Sizes with Applications to Transfer Learning.*
In Proceedings of the 24^{th} Annual Conference on Learning Theory (COLT). [pdf][ps]

Carbonell, J., Hanneke, S., and Yang, L. (2011).
*The Sample Complexity of Self-Verifying Bayesian Active Learning.*
In Proceedings of the 14^{th} International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf][ps]

Hanneke, S. (2011).
*Rates of Convergence in Active Learning*.
The Annals of Statistics, Vol. 39 (1), pp. 333-361. [pdf][ps][journal page]

**2010**

Carbonell, J., Hanneke, S., and Yang, L. (2010).
*Bayesian Active Learning Using Arbitrary Binary Valued Queries.*
In Proceedings of the 21^{st} International Conference on Algorithmic Learning Theory (ALT). [pdf][ps]

Also available in information theory jargon. [pdf][ps]

Fu, W., Hanneke, S., and Xing, E.P. (2010).
*Discrete Temporal Models of Social Networks.*
The Electronic Journal of Statistics, Vol. 4, pp. 585-605.
[pdf]

Hanneke, S. and Yang, L. (2010).
*Negative Results for Active Learning with Convex Losses.*
In Proceedings of the 13^{th} International Conference on Artificial Intelligence and Statistics (AISTATS).
[pdf][ps]

Balcan, M.-F., Hanneke, S., and Wortman Vaughan, J. (2010).
*The True Sample Complexity of Active Learning.*
Machine Learning, Vol. 80 (2-3), pp. 111-139.
[pdf][ps]

**2009**

Hanneke, S. (2009).
*Theoretical Foundations of Active Learning.*
Doctoral Dissertation. Machine Learning Department. Carnegie Mellon University.
[pdf][ps][defense slides]

Hanneke, S. (2009).
*Adaptive Rates of Convergence in Active Learning.*
In Proceedings of the 22^{nd} Annual Conference on Learning Theory (COLT). [pdf][ps][slides]

Also available in an expanded journal version.

Hanneke, S. and Xing, E.P. (2009).
*Network Completion and Survey Sampling.*
In Proceedings of the 12^{th} International Conference on Artificial Intelligence and Statistics (AISTATS). [pdf][ps][slides]

**2008**

Balcan, M.-F., Hanneke, S., and Wortman, J. (2008).
*The True Sample Complexity of Active Learning.*
In Proceedings of the 21^{st} Annual Conference on Learning Theory (COLT).
[pdf][ps][slides]

*Winner of the Mark Fulk Best Student Paper Award.*

Also available in an extended journal version.

**2007**

Balcan, M.-F., Even-Dar, E., Hanneke, S., Kearns, M., Mansour, Y., and Wortman, J. (2007).
*Asymptotic Active Learning.*
NIPS Workshop on Principles of Learning Problem Design.
[pdf][ps][spotlight slide]

Also available in an improved conference version
and an expanded journal version.

Hanneke, S. and Xing, E.P. (2007).
*Network Completion and Survey Sampling.*
NIPS Workshop on Statistical Network Models.

See our later conference publication.

Hanneke, S. (2007).
*Teaching Dimension and the Complexity of Active Learning.*
In Proceedings of the 20^{th} Annual Conference on Learning Theory (COLT).
[pdf][ps][slides]

Hanneke, S. (2007).
*A Bound on the Label Complexity of Agnostic Active Learning.*
In Proceedings of the 24^{th} Annual International Conference on Machine Learning (ICML).
[pdf][ps][slides]

Fu, W., Guo, F., Hanneke, S., and Xing, E.P. (2007).
*Recovering Temporally Rewiring Networks: A Model-based Approach.*
In Proceedings of the 24^{th} Annual International Conference on Machine Learning (ICML). [pdf]

Also see our related earlier work.

Hanneke, S. (2007).
*The Complexity of Interactive Machine Learning.*
KDD Project Report (aka Master's Thesis).
Machine Learning Department, Carnegie Mellon University.
[pdf][ps][slides]

Includes some interesting results from a class project on
*The Cost Complexity of Interactive Learning*, in addition to
my COLT07 and ICML07 papers.

**2006**

Hanneke, S. and Xing, E.P. (2006).
*Discrete Temporal Models of Social Networks.*
In Proceedings of the ICML Workshop on Statistical Network Analysis.
[pdf][ps][slides]

Also available in an extended journal version.

Hanneke, S. (2006).
*An Analysis of Graph Cut Size for Transductive Learning.*
In Proceedings of the 23^{rd} International Conference on Machine Learning (ICML).
[pdf][ps][slides ppt][slides pdf]