Simple On-line Learning Algorithms: Multiplicative versus Additive Updates

Saturday, March 7, 1998 - 11:55am - 12:20pm
Keller 3-180
Manfred K. Warmuth (University of California)
The goal is to design simple learning algorithms for the on-line estimation of parameters in non-stationary settings. An example would be estimating the arrival rate of cars at a traffic light.

Gradient descent is the standard heuristic for designing iterative learning algorithms that aim to minimize some loss function. In this heuristic, the new value of a parameter equals the old value minus a constant (the learning rate) times the derivative (gradient) of the loss function with respect to that parameter. We discuss new heuristics in which the gradients are used in a different way, leading to a multiplicative update of the parameters.
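To make the contrast concrete, here is a minimal sketch in Python (with NumPy) of the two update rules for on-line linear regression; the choice of squared loss, the simplex normalization, and the learning rates are illustrative assumptions, not details from the talk. The additive rule is plain gradient descent; the multiplicative rule follows the exponentiated gradient (EG) form, in which the same gradient enters through the exponent of a factor multiplying each weight.

import numpy as np

def gd_update(w, x, y, eta=0.01):
    # Additive rule: new value = old value minus eta times the gradient
    # of the squared loss (w.x - y)^2 / 2 with respect to w.
    grad = (w @ x - y) * x
    return w - eta * grad

def eg_update(w, x, y, eta=0.05):
    # Multiplicative rule (exponentiated gradient): the gradient enters
    # through the exponent of a per-weight factor; the weights stay
    # positive and are renormalized to sum to one.
    grad = (w @ x - y) * x
    w_new = w * np.exp(-eta * grad)
    return w_new / w_new.sum()

Note that the two rules use the gradient identically up to the final step: it is subtracted from the weight vector in one case and exponentiated into a multiplicative factor in the other.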

Multiplicative updates perform well when the input dimension is large, as in data mining applications. They can also adapt quickly when the inputs change over time. Techniques from convex optimization, statistics, and control theory are used to derive and analyze the algorithms.
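As a hypothetical illustration of the large-dimension regime, the snippet below (reusing the update functions sketched above) runs both rules against a sparse target in a 1000-dimensional input space; the dimension, target weights, learning rates, and trial count are arbitrary choices for the sketch, not figures from the talk.

# Sparse target in a high-dimensional input space: the regime where
# the multiplicative update is expected to home in quickly.
rng = np.random.default_rng(0)
d = 1000
u = np.zeros(d)
u[:3] = 1.0 / 3                      # only 3 of 1000 features are relevant
w_gd = np.full(d, 1.0 / d)           # both learners start from uniform weights
w_eg = np.full(d, 1.0 / d)
for t in range(2000):
    x = rng.standard_normal(d)
    y = float(u @ x)                 # noiseless label from the sparse target
    w_gd = gd_update(w_gd, x, y, eta=0.001)
    w_eg = eg_update(w_eg, x, y, eta=0.05)
print("GD distance to target:", np.linalg.norm(w_gd - u))
print("EG distance to target:", np.linalg.norm(w_eg - u))

The intuition behind the experiment: the additive rule must keep its learning rate small because dense inputs make the gradient large in every coordinate, while the multiplicative rule concentrates its weight on the few relevant features at a rate that degrades only logarithmically with the input dimension.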