The Computational Principles of Learning




How can computers learn and gain new ideas from data, adapt to their users, and improve performance with experience?


Many of the problems we want computers to solve are no longer tasks we know how to explicitly instruct a computer to perform: discovering new patterns in medical and biological data, identifying faces in images, predicting what items a user will need next, reducing waste in resource usage, and even detecting and deleting spam. As a result, Machine Learning – the automatic learning of decision rules and patterns from data – has become increasingly crucial to computing applications. In addition, we need systems that can adapt to changing conditions, that are user-friendly because they adapt to the needs of their individual users, and that improve performance over time. Addressing this challenge and creating the next generation of learning systems requires fundamentally advancing our understanding of the computational principles of learning as an algorithmic and informational process.


The area of Computational Learning Theory has formed to develop precisely this kind of fundamental understanding: combining algorithmic and statistical approaches to characterize what kinds of rules and patterns can be reliably learned from what kinds of data, and to understand the underlying computational and informational building blocks of learning. This research has produced a number of powerful tools that get at the heart of these questions. These include techniques such as VC-dimension and Rademacher complexity for analyzing how much data is needed to draw confident conclusions; powerful algorithms such as SVMs, kernel methods, and exponential weighting (among many others) for efficiently learning natural rules from data; and techniques such as Boosting for converting complicated learning tasks into a series of simpler, more manageable ones. In addition, research has been developing an improved understanding of the substantial role that auxiliary information such as unlabeled data can play in the learning process, as well as the power of methods such as active learning for learning quickly from limited data.
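To give a concrete flavor of one of these techniques, the exponential weighting (multiplicative weights) idea can be sketched in just a few lines: maintain one weight per candidate rule or "expert," predict by sampling in proportion to the weights, and multiplicatively shrink each expert's weight according to its loss on each round. The sketch below is illustrative only; the function name, the learning-rate parameter `eta`, and the toy loss encoding are assumptions for this example, not details from the discussion above.

```python
def exponential_weights(num_experts, losses, eta=0.5):
    """Illustrative multiplicative-weights (Hedge-style) sketch.

    losses[t][i] is the loss (in [0, 1]) of expert i at round t.
    Returns the final weights and the algorithm's cumulative
    expected loss when it follows an expert drawn in proportion
    to the current weights each round.
    """
    weights = [1.0] * num_experts
    total_loss = 0.0
    for round_losses in losses:
        z = sum(weights)
        probs = [w / z for w in weights]
        # Expected loss this round of the weighted-random strategy.
        total_loss += sum(p * l for p, l in zip(probs, round_losses))
        # Penalize each expert multiplicatively by its loss.
        for i, l in enumerate(round_losses):
            weights[i] *= (1 - eta) ** l
    return weights, total_loss
```

Because weights decay geometrically in accumulated loss, even one consistently good expert quickly dominates the mixture, which is the source of the classic guarantee that the algorithm's total loss is not much worse than that of the single best expert in hindsight.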

Yet many key challenges remain. These include both fundamental computational problems within existing, well-understood learning models and the challenge of developing new frameworks to guide the next generations of learning algorithms. Together these will enable algorithms that make the best use of data, interaction, related learned knowledge, and other forms of guidance for efficient, accurate learning of complex tasks.

Contributors and Credits

Avrim Blum



