Talk Abstracts
IMA "Hot Topics" Workshop
Data-driven Control and Optimization
December 4-6, 2002


2002-2003 Program: Optimization

S. Massoud Amin (Electric Power Research Institute (EPRI), 3412 Hillview Avenue, Palo Alto, CA 94304-1395, USA)  mamin@epri.com  http://rodin.wustl.edu/~massoud/amin.html

Impact of Data-based Modeling on Electricity Infrastructure Operations and Security Applications    Slides for the Talk:    pdf    Slides for the Panel Discussion:    pdf

Virtually every crucial economic and social function depends on the secure, reliable operation of energy, telecommunications, transportation, financial, and other infrastructures. The Internet, computer networks, and our digital economy have increased the demand for reliable and disturbance-free electricity; banking and finance depend on the robustness of electric power, cable, and wireless telecommunications. With dramatic increases in inter-regional bulk power transfers and accelerating diversity of transactions among parties, the electric power grid is being used in ways for which it was not originally designed. Grid congestion and atypical power flows are increasing, while customer expectations of reliability are rising to meet the needs of a pervasively digital world.

Furthermore, as the power grids become heavily loaded with long-distance transfers, the already complex system dynamics become even more important. The potential for rare but high-impact cascading events is one of many areas in which new science and technology concepts are under development. Analysis and modeling of interdependent infrastructures (e.g., the electric power grid together with its protection systems, telecommunications, oil/gas pipelines, and energy markets) is especially pertinent. This presentation will focus on a strategic vision, extending a decade or longer, for a data-based paradigm that would enable more secure and robust system operation, security monitoring, and efficient energy markets.

Andrew R. Barron (Department of Statistics, Yale University)  andrew.barron@yale.edu

Probability Theory of Compounding Wealth and Universal Portfolio Estimation

We review the roles of probability theory, statistics, and information theory in examining compounding wealth in gambling and stock market settings. Whereas maximizing conditional expected log return produces the highest long-run growth rate of wealth, we show that, in contrast, maximizing other common utilities expressed by power laws optimizes large-deviation probabilities (maximizing the slim chance of unusually large returns). Universal portfolios, their relationship to Bayes strategies, and their minimax characteristics are reviewed. Building on past work by Tom Cover and his colleagues, universal portfolios are determined that have nearly minimal maximum regret compared to the best with hindsight (in customary classes of portfolio strategies), uniformly over all possible stock price sequences. Strategies for estimation and computation of universal portfolios are discussed.
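
To make the universal-portfolio idea concrete, here is a minimal sketch (not from the talk) of Cover's universal portfolio for two assets: the investor's final wealth equals the average of the wealths of all constant-rebalanced portfolios (CRPs), approximated below by a uniform grid over the mixing weight. The price-relative data are randomly generated purely for illustration.

```python
import numpy as np

def crp_wealth(price_relatives, b):
    """Final wealth of a constant-rebalanced portfolio holding fraction b in asset 1."""
    return np.prod(b * price_relatives[:, 0] + (1.0 - b) * price_relatives[:, 1])

def universal_portfolio_wealth(price_relatives, grid_size=1000):
    """Cover's universal portfolio (uniform prior): its final wealth is the average
    of the CRP wealths, here approximated on a grid over the mixing weight."""
    grid = np.linspace(0.0, 1.0, grid_size)
    wealths = np.array([crp_wealth(price_relatives, b) for b in grid])
    return wealths.mean(), grid[np.argmax(wealths)]

# Randomly generated daily price relatives for two assets (for illustration only).
rng = np.random.default_rng(0)
x = rng.lognormal(mean=0.0, sigma=0.01, size=(250, 2))

universal, best_b = universal_portfolio_wealth(x)
print(f"universal wealth {universal:.4f}, best CRP in hindsight {crp_wealth(x, best_b):.4f}")
```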

Joe H. Chow (Department of Electrical & Computer Systems Engineering, Rensselaer Polytechnic Institute (RPI))

Optimization and Risk Management in Open-Access Electric Energy Markets    Slides:    pdf

In an open-access electric energy market, electricity suppliers and load-serving entities are allowed to trade energy and submit bids into a daily energy market. The power system is managed by an independent system operator, who usually administers the energy market as well. Suppliers and loads need to optimize their expected energy positions, as well as any real-time variations, under the limited information structure imposed by the independent system operator. Thus the optimization has to be based on past data used to predict future market conditions. The talk will provide an overview of open-access electric energy markets as well as some research on optimal bidding strategies by Ning Lu, a PhD candidate at RPI.
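
The abstract does not specify the bidding model, so the following is only a generic, hypothetical sketch of data-based bidding: a price-taking supplier with a known marginal cost picks the single bid price that maximizes average profit over a history of clearing prices, the offer being accepted whenever the bid does not exceed the clearing price. All quantities and prices are invented.

```python
import numpy as np

def expected_profit(bid, clearing_prices, cost, quantity=1.0):
    """Average profit when the offer is accepted whenever bid <= clearing price
    and the supplier is paid the clearing price (uniform-price market)."""
    accepted = clearing_prices >= bid
    return np.mean(accepted * (clearing_prices - cost) * quantity)

def best_bid(clearing_prices, cost, candidate_bids):
    profits = [expected_profit(b, clearing_prices, cost) for b in candidate_bids]
    return candidate_bids[int(np.argmax(profits))]

# Hypothetical history of hourly clearing prices ($/MWh) and a marginal cost of 30 $/MWh.
rng = np.random.default_rng(1)
history = rng.gamma(shape=4.0, scale=10.0, size=1000)
print("suggested bid:", best_bid(history, cost=30.0, candidate_bids=np.linspace(0.0, 100.0, 101)))
```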

George Cybenko (Dartmouth College)  gvc@dartmouth.edu  http://www.dartmouth.edu/~gvc

Dynamic Dynamical Systems    Slides:    html    pdf    ppt

A new class of control problems is emerging in which the state space of the system changes dynamically. This presents two novel challenges: how to define these changes dynamically, and how to develop effective controls for systems that change dynamically. This talk will present examples and ongoing work addressing both challenges.

Joao P. Hespanha (Center for Control Engineering and Computation, University of California, Santa Barbara)  hespanha@ece.ucsb.edu

Complexity Issues in Probabilistic Mapping    Slides:    pdf

This talk addresses the problem of estimating the positions of a group of objects from a stream of noisy sensor measurements, often called probabilistic mapping. From a formal point of view, probabilistic maps are simply the probability densities of the object positions, conditioned on the available sensor measurements. In this talk we will explore issues related to the computational complexity of constructing probabilistic maps and of using them for path planning.
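
As a minimal illustration of the formal definition above (not of the complexity results in the talk), the sketch below maintains a probabilistic map as a posterior over a discretized grid of candidate object positions and updates it by Bayes' rule as noisy range measurements arrive. The Gaussian sensor model and all numbers are assumptions made for the example.

```python
import numpy as np

def update_map(prior, positions, sensor_pos, measured_range, sigma=0.5):
    """One Bayes update of a gridded probabilistic map.

    prior          -- probabilities over the candidate object positions
    positions      -- (N, 2) array of grid coordinates
    measured_range -- noisy distance from the sensor to the object
    """
    predicted = np.linalg.norm(positions - sensor_pos, axis=1)
    likelihood = np.exp(-0.5 * ((measured_range - predicted) / sigma) ** 2)
    posterior = prior * likelihood
    return posterior / posterior.sum()

# A 20 x 20 grid over a 10 m x 10 m area, uniform prior, two noisy range readings.
xs, ys = np.meshgrid(np.linspace(0, 10, 20), np.linspace(0, 10, 20))
grid = np.column_stack([xs.ravel(), ys.ravel()])
belief = np.full(len(grid), 1.0 / len(grid))
belief = update_map(belief, grid, sensor_pos=np.array([0.0, 0.0]), measured_range=7.0)
belief = update_map(belief, grid, sensor_pos=np.array([10.0, 0.0]), measured_range=6.0)
print("most likely position:", grid[np.argmax(belief)])
```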

Arthur Kordon (Research Leader, Corporate R&D, The Dow Chemical Company)  AKKordon@dow.com

Hybrid Intelligent Systems for Data-Driven Monitoring and Optimization    Slides:    pdf

A novel approach for data-driven modeling based on the integration of four key computational intelligence techniques (genetic programming, analytical neural networks, support vector machines, and particle swarm optimizers) is proposed. The integrated methodology amplifies the advantages of the individual techniques, significantly reduces the development time, and delivers robust empirical models with low maintenance cost. The advantages of the proposed methodology for data-driven monitoring and optimization will be illustrated with several successful applications in The Dow Chemical Company.
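
The abstract does not describe how the four techniques are integrated, so no attempt is made to reproduce that methodology here; as a small, self-contained illustration of just one of the named components, the sketch below is a bare-bones (global-best) particle swarm optimizer applied to a standard test function.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimal particle swarm optimizer (global-best variant) for minimization."""
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))          # particle positions
    v = np.zeros_like(x)                                       # particle velocities
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

# Example: minimize the Rosenbrock function in two dimensions.
rosen = lambda p: (1 - p[0])**2 + 100 * (p[1] - p[0]**2)**2
print(pso(rosen, dim=2))
```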

Rudolf Kulhavy (Honeywell ACS Advanced Technology)  kulhavy@htc.honeywell.cz

Data-driven Decision-making: The Good, the Bad, and the Ugly    Slides:    html    pdf    pps

The overwhelming amount of data stored in databases sometimes gives rise to exaggerated expectations. One popular myth is that a large amount of data necessarily carries a large amount of information. This is clearly not so: data stored in databases are often redundant, showing just a few of the many possible patterns of process behavior. Very rarely are the collected data the result of a planned experiment; rather, they are a series of snapshots of routine operation. What, then, is so exciting about the massive data sets available to us today? It is not that a huge amount of data can replace domain knowledge and the art of modeling. It is that, for the first time, we have the whole process history at our disposal when making decisions that affect future behavior. This makes database-centric decision-making an exciting alternative to current paradigms. The presentation discusses opportunities and challenges presented by the new paradigm. Special attention is paid to the selection of a "data cube" capturing multi-dimensional data, the definition of "similar" historical data points, and similarity search in high-dimensional spaces, while sharing experience from real-life applications of data-centric decision support systems.
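
As a rough sketch of one ingredient mentioned above (the notion of "similar" historical data points), the code below retrieves the historical operating points nearest to a query in a standardized Euclidean metric and averages their recorded outcomes. The process variables, the "yield" column, and the data themselves are invented for illustration.

```python
import numpy as np

def similar_history(query, history, k=25):
    """Indices of the k historical operating points closest to the query
    in standardized Euclidean distance (one simple notion of 'similarity')."""
    sigma = history.std(axis=0) + 1e-12
    d = np.linalg.norm((history - query) / sigma, axis=1)
    return np.argsort(d)[:k]

# Hypothetical process history: columns = (feed rate, temperature, pressure),
# plus a recorded outcome ('yield') for each historical snapshot.
rng = np.random.default_rng(2)
X = rng.normal(size=(10_000, 3)) * [5.0, 20.0, 2.0] + [100.0, 350.0, 10.0]
yields = 0.8 - 0.001 * ((X[:, 1] - 350.0) / 10.0) ** 2 + rng.normal(0.0, 0.01, len(X))

idx = similar_history(np.array([102.0, 355.0, 9.5]), X)
print("predicted yield from similar past operation:", yields[idx].mean())
```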

Steffen L. Lauritzen (Department of Mathematical Sciences, Aalborg University)  steffen@math.auc.dk  http://www.math.auc.dk/~steffen

LIMIDs - Representing and Solving Decision Problems with Limited Information    Slides:    pdf

The notion of a Limited Memory Influence Diagram (LIMID) is introduced as a Bayesian network augmented with nodes representing decisions and utility functions. For each decision, it is specified what information is available at the time the decision is to be made. In contrast with traditional influence diagrams, the assumption of no forgetting is relaxed, and there are no additional constraints on the order in which decisions are to be taken. This allows for multiple decision makers and decision makers with limited memory, and it reduces the complexity of strategies. We give a local computation algorithm for finding locally optimal policies, conditions for the policies to be globally optimal, and indicate how this can be exploited to obtain bounds on the loss of utility, for example in partially observed Markov decision processes (POMDPs). The lecture is largely based upon:

Lauritzen, S.L. and Nilsson, D. (2001). Representing and Solving Decision Problems with Limited Information, Management Science, 47, 1238-1251.

Can be obtained from http://www.math.auc.dk/~steffen/papers/limids.pdf
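
As a tiny worked illustration of the local optimization step described in the abstract (not the full single-policy-updating algorithm of the paper), the sketch below computes a locally optimal policy for a single decision whose only available information is a noisy observation of a hidden state. The probabilities and utilities are made up.

```python
import numpy as np

# A hypothetical one-decision LIMID: hidden state H, an observation O that is the only
# information available to the decision D, and a utility U(H, D).
p_h = np.array([0.3, 0.7])                          # P(H)
p_o_given_h = np.array([[0.8, 0.2],                 # P(O | H): rows index H, columns index O
                        [0.1, 0.9]])
utility = np.array([[10.0, -5.0],                   # U(H, D): rows index H, columns index D
                    [-2.0,  4.0]])

def locally_optimal_policy():
    """For each observation o, pick the decision maximizing expected utility
    under P(H | o) -- the local optimization step for a single decision node."""
    policy = {}
    for o in range(p_o_given_h.shape[1]):
        posterior = p_h * p_o_given_h[:, o]
        posterior /= posterior.sum()
        expected = posterior @ utility               # expected utility of each decision
        policy[o] = int(np.argmax(expected))
    return policy

print(locally_optimal_policy())                      # maps each observation to a decision
```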

Jay H. Lee ( School of Chemical Engineering, Georgia Institute of Technology)  jay.lee@che.gatech.edu

Simulation Based Approximation of Value Function for Process Control    Slides:    pdf

Although model predictive control (MPC) has firmly etched itself in process control practice, its large on-line computational demand and its inability to rigorously consider information feedback under uncertainty limit its use in complex systems, which are characterized by multi-scale, nonlinear, hybrid dynamics and significant uncertainties. In this talk, we propose an alternative approach based on the infinite-horizon cost-to-go (the 'value function'). The key issue lies in obtaining an accurate approximation of the value function over the relevant regions of the state space. We propose to build an approximation using simulation data and to improve it iteratively through policy or value iteration and additional simulation. We demonstrate the efficacy of the approach on two different bioreactor optimal control problems. Along the way, we also point out some critical issues and outstanding theoretical problems.
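
A minimal sketch of the general idea, fitted value iteration from simulated data, is given below on a made-up one-dimensional regulation problem with a quadratic value-function approximation. It is not the bioreactor application from the talk; the dynamics, cost, and basis are all assumptions chosen for illustration.

```python
import numpy as np

# Toy problem (not the bioreactor case study): regulate a scalar stochastic system
# x_{k+1} = 0.9 x_k + u_k + w_k with stage cost x^2 + 0.1 u^2 and discount factor gamma.
gamma, actions = 0.95, np.linspace(-1.0, 1.0, 11)
rng = np.random.default_rng(3)

def step(x, u):
    return 0.9 * x + u + rng.normal(0.0, 0.1), x**2 + 0.1 * u**2

def features(x):
    return np.array([1.0, x, x**2])                # quadratic basis for the value function

theta = np.zeros(3)                                # value-function weights
states = rng.uniform(-3.0, 3.0, size=1000)         # simulated state samples

for _ in range(25):                                # fitted value iteration
    targets = []
    for x in states:
        # Bellman backup: best sampled one-step cost plus discounted approximate cost-to-go.
        q = [c + gamma * features(xn) @ theta for xn, c in (step(x, u) for u in actions)]
        targets.append(min(q))
    Phi = np.array([features(x) for x in states])
    theta, *_ = np.linalg.lstsq(Phi, np.array(targets), rcond=None)

print("fitted value-function weights:", theta)
```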

Susan A. Murphy (Department of Statistics, Quantitative Methodology Program, University of Michigan)  samurphy@umich.edu  http://www.stat.lsa.umich.edu/~samurphy/

Dynamic Treatment Regimes for Chronic, Relapsing Disorders

The management of chronic, relapsing disorders can be viewed as a control problem in that multi-stage treatment decisions are made with the goal of optimizing the mean response. For example, in the prevention of relapse by recovering alcoholics, the response might be percent days abstinent, and the treatment decisions might include which preventative treatment should be used initially, how long we should wait before declaring the initial treatment ineffective and switching to a secondary treatment, which secondary treatment should be used, when treatment should be stopped, and so on. These treatment decisions would be made on the basis of time-varying covariates such as the number of days of heavy drinking, measures of craving, measures of stress, patient preference, and results of urinalyses.

An important open problem in this area is how we might use a batch of data, i.e., a longitudinal sample of individuals for whom the response, covariates, and treatment decisions are recorded for each time period, to estimate the optimal decision rules. This challenging area is characterized by delayed effects of treatment, an unknown model relating past treatments and covariates to future covariates, and a high noise-to-signal ratio.
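
One standard way to attack this batch problem is backward induction combined with regression (often called Q-learning for dynamic treatment regimes); whether this is the estimator the talk has in mind is not stated, so the two-stage sketch below, on simulated data with linear working models, should be read only as an illustration of the general approach.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 2000

# Simulated two-stage trajectories: covariate S1, treatment A1, covariate S2, treatment A2, response Y.
S1 = rng.normal(size=n)
A1 = rng.integers(0, 2, size=n)
S2 = 0.5 * S1 + A1 + rng.normal(size=n)
A2 = rng.integers(0, 2, size=n)
Y = S2 + A2 * (1.0 - S2) + rng.normal(size=n)      # the stage-2 treatment effect depends on S2

def fit_linear(X, y):
    return np.linalg.lstsq(np.column_stack([np.ones(len(X)), X]), y, rcond=None)[0]

def predict(beta, X):
    return np.column_stack([np.ones(len(X)), X]) @ beta

# Stage 2: regress Y on (S2, A2, S2*A2); the estimated rule picks the better A2 given S2.
beta2 = fit_linear(np.column_stack([S2, A2, S2 * A2]), Y)
q2 = lambda s, a: predict(beta2, np.column_stack([s, a, s * a]))
V2 = np.maximum(q2(S2, np.zeros(n)), q2(S2, np.ones(n)))   # value of acting optimally at stage 2

# Stage 1: regress the optimal stage-2 value on (S1, A1, S1*A1) to get the stage-1 rule.
beta1 = fit_linear(np.column_stack([S1, A1, S1 * A1]), V2)
print(f"stage-2 rule: give A2 = 1 when {beta2[2]:.2f} + {beta2[3]:.2f} * S2 > 0")
print(f"stage-1 rule: give A1 = 1 when {beta1[2]:.2f} + {beta1[3]:.2f} * S1 > 0")
```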

Gregory Piatetsky-Shapiro (KDnuggets)  gps@kdnuggets.com

Knowledge Discovery in Microarray Gene Expression Data    Slides:    html    pdf    ppt

DNA microarrays are revolutionizing molecular biology, allowing simultaneous analysis of many thousands of genes. Microarrays hold the promise of important applications, including creating novel, genetic-based diagnostic tests, finding new molecular targets for therapy, and developing personalized treatments.

Microarrays allow analysis of dynamic processes and deeper insight into biological pathways.

However, the large number of genes and the typically small number of samples present unique challenges for DNA microarray data analysis. We discuss issues in the normalization of microarray data, selecting the best set of genes for classification and clustering, randomization techniques, and building classification and clustering models.

We illustrate these processes using a number of software tools and show new results with potential biological significance.
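
As a small illustration of two of the steps mentioned above, gene selection and randomization, the sketch below ranks genes by a two-sample t-statistic and uses a permutation test to judge whether the top score could have arisen by chance. The expression matrix is simulated, not a real microarray data set.

```python
import numpy as np

def t_scores(expr, labels):
    """Two-sample t-like score per gene; expr is (samples x genes), labels are 0/1."""
    a, b = expr[labels == 0], expr[labels == 1]
    num = a.mean(axis=0) - b.mean(axis=0)
    den = np.sqrt(a.var(axis=0, ddof=1) / len(a) + b.var(axis=0, ddof=1) / len(b))
    return num / den

rng = np.random.default_rng(5)
labels = np.array([0] * 20 + [1] * 20)                  # 20 samples per class
expr = rng.normal(size=(40, 5000))                      # 5000 genes, simulated expression levels
expr[labels == 1, :10] += 1.5                           # 10 genes are truly differential

observed = np.abs(t_scores(expr, labels))
top_gene, top_score = int(np.argmax(observed)), observed.max()

# Randomization: how often does the best score under shuffled labels beat the observed one?
null_max = [np.abs(t_scores(expr, rng.permutation(labels))).max() for _ in range(200)]
p_value = np.mean(np.array(null_max) >= top_score)
print(f"top gene {top_gene}, score {top_score:.2f}, permutation p-value {p_value:.3f}")
```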

Maria Prandini (Department of Electronics for Automation, University of Brescia, Italy)  prandini@ing.unibs.it

Cautious Hierarchical Switching Control of Stochastic Linear Systems (Poster Session)

We address the problem of controlling an unknown stochastic linear system and propose a new methodology that incorporates the advantages of cautious stochastic control and switching control in a hierarchical scheme. The design of cautious switching controllers is based on the following two-step procedure: i) a probability measure describing the likelihood of different models is updated on-line based on observations; and ii) at each switching time, the controller in the candidate controller set that optimizes a certain average control cost with respect to the updated probability measure is selected. If a suitably structured set of candidate controllers is used in this cautious switching scheme, a controller is automatically chosen that appropriately trades off performance against robustness. Randomized algorithms are used to make the controller selection computationally tractable.

This is joint work with M.C. Campi and J.P. Hespanha.
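
A schematic sketch of the two-step procedure is given below for a scalar system with a finite set of hypothesized models and candidate gains: a posterior over the models is updated from observed transitions, and at each switching time the gain with the smallest posterior-averaged stationary cost is selected. The model set, costs, and noise level are invented, and the randomized-algorithm machinery mentioned in the abstract is not reproduced.

```python
import numpy as np

# Candidate models x_{t+1} = a x_t + u_t + w,  w ~ N(0, q), and candidate gains u = -K x.
a_grid = np.array([0.5, 0.9, 1.2, 1.5])           # hypothesized values of the unknown parameter a
K_grid = np.linspace(0.0, 2.0, 21)                # candidate controller gains
q, r = 0.1, 0.1                                   # noise variance and control weight

def avg_cost(a, K):
    """Stationary cost E[x^2 + r*u^2] of gain K on model a (large penalty if unstable)."""
    pole = a - K
    if abs(pole) >= 1.0:
        return 1e6
    return (1.0 + r * K**2) * q / (1.0 - pole**2)

def update_posterior(prior, x_prev, u_prev, x_now):
    """Bayes update of the model probabilities from one observed transition."""
    resid = x_now - (a_grid * x_prev + u_prev)
    post = prior * np.exp(-0.5 * resid**2 / q)
    return post / post.sum()

def cautious_gain(posterior):
    """Gain minimizing the posterior-averaged stationary cost (the 'cautious' selection)."""
    J = [posterior @ np.array([avg_cost(a, K) for a in a_grid]) for K in K_grid]
    return K_grid[int(np.argmin(J))]

# Closed-loop simulation: the true system has a = 1.2, unknown to the controller.
rng = np.random.default_rng(6)
posterior, x, K = np.full(len(a_grid), 0.25), 1.0, 1.0
for t in range(50):
    u = -K * x
    x_next = 1.2 * x + u + rng.normal(0.0, np.sqrt(q))
    posterior = update_posterior(posterior, x, u, x_next)
    if t % 10 == 9:                                # controller switching times
        K = cautious_gain(posterior)
    x = x_next
print("posterior over a:", posterior.round(3), " selected gain:", K)
```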

Daniel E. Rivera (Department of Chemical and Materials Engineering, Arizona State University)  daniel.rivera@asu.edu  http://www.eas.asu.edu/~csel/rivera.html

Model-on-Demand Estimation for Improved Identification and Control of Process Systems    Slides:    pdf

In recent years we have been pursuing the concept of nonlinear identification and control through a data-driven framework named Model-on-Demand (MoD). The MoD approach enhances traditional local modeling and offers the potential for performance rivaling global methods (such as NARX models and neural networks), while demanding substantially less detailed knowledge of model structure from the user and providing much more reliable numerical computations.

Research in our laboratory (performed in collaboration with the Division of Automatic Control at Linkoping University, Sweden) has focused on demonstrating the MoD estimation framework as an effective, practical means for modeling and controlling nonlinear process systems. Research topics have included such diverse problems as MoD-based automated smoothing of empirical transfer function estimates (ETFEs), systematic design of databases for MoD estimation using multi-level pseudo-random and minimum crest factor multisine input signals, and the development of a comprehensive MoD-based Predictive Control methodology. A Matlab-based tool for MoD estimation and control, developed in our laboratory in collaboration with Linkoping researchers, is available in the public domain.

The presentation will describe our general experiences with MoD estimation in each of these topical areas. Some pressing challenges and open issues in the application of MoD estimation will be discussed. The talk will conclude with a summary of current activities, among these the application of MoD-based estimation and control to inventory management in supply chains.
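
The heart of any MoD-style estimator is a local model fitted on demand around the current query point. The sketch below does locally weighted linear regression on the nearest neighbours of a query, with a simple fixed neighbourhood size and invented data; the adaptive bandwidth selection that distinguishes the actual MoD framework is omitted.

```python
import numpy as np

def mod_predict(query, X, y, k=50):
    """Fit a weighted linear model on the k nearest stored samples and evaluate it at the query."""
    d = np.linalg.norm(X - query, axis=1)
    idx = np.argsort(d)[:k]
    h = d[idx].max() + 1e-12                        # crude local bandwidth: distance to k-th neighbour
    w = (1.0 - (d[idx] / h) ** 3) ** 3              # tricube weights
    Phi = np.column_stack([np.ones(k), X[idx] - query])
    W = np.diag(w)
    beta = np.linalg.solve(Phi.T @ W @ Phi, Phi.T @ W @ y[idx])
    return beta[0]                                   # the local fit evaluated at the query point

# Invented nonlinear data stored in a 'process database'.
rng = np.random.default_rng(7)
X = rng.uniform(-3.0, 3.0, size=(2000, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=2000)
print("MoD prediction at x = 1.0:", mod_predict(np.array([1.0]), X, y), " (true value:", np.sin(1.0), ")")
```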

Tariq Samad (Honeywell Automation and Control Solutions)  tariq.samad@honeywell.com  http://www.htc.honeywell.com/people/tariq_samad

High-Confidence Control: Ensuring Reliability in High-Performance Real-Time Systems    Slides:    html    pdf    ppt

Technology transfer is an especially difficult proposition for real-time control. To facilitate it, we need to complement the "high performance" orientation of control research with an emphasis on demonstrating "high confidence" in real-time implementation. We focus on a particular problem in this context: Complex algorithms have unpredictable computational characteristics that nevertheless need to be modeled. Statistical verification is suggested as a possible approach and we are exploring the application of statistical learning theory. A synthesis of control engineering and computer science is required if effective solutions are to be devised.
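
The abstract does not spell out the statistical-learning approach, so the sketch below shows only one elementary form of statistical verification: given measured run times of an algorithm that all meet a deadline, bound the probability of a deadline miss using a simple order-statistic argument. The timing data are simulated.

```python
import numpy as np

def miss_probability_bound(run_times, deadline, confidence=0.99):
    """If every observed run time meets the deadline, return eps such that
    P(run time > deadline) <= eps holds with the requested confidence,
    using the elementary bound (1 - eps)^n <= 1 - confidence for n i.i.d. samples."""
    n = len(run_times)
    if np.any(run_times > deadline):
        raise ValueError("a sampled run missed the deadline; this simple bound does not apply")
    return 1.0 - (1.0 - confidence) ** (1.0 / n)

# Simulated run times (ms) of a complex real-time control computation.
rng = np.random.default_rng(8)
times = rng.gamma(shape=5.0, scale=1.0, size=2000)
deadline = times.max() + 1.0                        # a deadline every observed run happened to meet
print("deadline-miss probability bound:", miss_probability_bound(times, deadline))
```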

Concluding Remarks Slides:    html    pdf    ppt

Steve Smale (Department of Mathematics, University of California, Berkeley)  smale@math.berkeley.edu

Fast Algorithms for Dealing with Data and Understanding Them

Recent developments in learning theory help to broaden and deepen methods for analysing data: non-linear algorithms are replaced by linear ones in high-dimensional spaces.
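
A standard instance of this idea (an assumption here, since the abstract gives no specifics) is the kernel trick: a nonlinear fit becomes ridge regression, a linear method, in the high-dimensional feature space implicitly defined by a kernel. The sketch below applies Gaussian-kernel ridge regression to synthetic data.

```python
import numpy as np

def gaussian_kernel(A, B, gamma=2.0):
    """Gram matrix of the Gaussian (RBF) kernel between two sets of points."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def kernel_ridge_fit(X, y, lam=1e-2):
    """Ridge regression -- a linear method -- carried out in the kernel feature space."""
    K = gaussian_kernel(X, X)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return lambda Xq: gaussian_kernel(Xq, X) @ alpha

# Synthetic nonlinear data fitted by the 'linear in feature space' method.
rng = np.random.default_rng(9)
X = rng.uniform(-2.0, 2.0, size=(200, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.normal(size=200)
predict = kernel_ridge_fit(X, y)
print("prediction at x = 0.5:", predict(np.array([[0.5]]))[0], " (true value:", np.sin(1.5), ")")
```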

Bruce F. Wollenberg (Department of Electrical and Computer Engineering, University of Minnesota)  wollenbe@ece.umn.edu  http://www.ece.umn.edu/faculty/wollenberg.html

Solving the ISO "Seams" Problem for Uniform Boundary LMP's    Slides:    pdf

The US Department of Energy's Federal Energy Regulatory Commission has released a Standard Market Design (SMD) which introduces the problem of enabling two Independent System Operators (ISO's), each of which independently calculates market clearing prices for its own market, to reach consistent Locational Marginal Prices (LMP's) along a shared boundary. Without consistent LMP's, trading across a boundary (seam) can be difficult or impossible. This presentation will focus on the issue of enabling two ISO's to reach a common set of LMP's on the boundary.

William Hogan has presented some preliminary work toward a solution wherein the solutions of multiple ISO's are iterated until a common solution is reached. In Hogan's work, only transmission limit constraints were imposed on the solution. We have extended this to include first-contingency constraints as well. The market clearing calculations are done with an Optimal Power Flow (OPF) based on a full Alternating Current (AC) model of the power system; the LMP's are the bus power constraint Lagrange multipliers from the solution. Both Hogan's work and our own have so far used linear network models, not full AC OPF solutions and full AC contingency analysis. The talk will explore many of the difficulties of achieving common boundary bus LMP's when each ISO uses an AC OPF and AC contingency analysis to calculate the LMP's, and will describe the research directions we see as promising. The aim of this work is to provide tools that allow ISO's to continue to operate independently while maintaining uniform LMP's along the boundaries with other ISO's.
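
To illustrate why boundary prices require coordination at all, here is a deliberately simplified toy (an assumption-laden sketch, nothing like the AC OPF with contingency analysis discussed in the talk): two lossless areas with quadratic generation costs share one tie-line, each computes its own marginal price for a given tie flow, and the flow is adjusted toward the price difference until the boundary prices agree or the tie limit binds.

```python
import numpy as np

# Two areas linked by a single lossless tie-line. Each area has a quadratic generation cost
# c_i(g) = a_i*g + 0.5*b_i*g^2 and a fixed load; f > 0 means area A exports to area B.
a_A, b_A, load_A = 10.0, 0.05, 800.0               # invented cost and load data
a_B, b_B, load_B = 20.0, 0.08, 600.0
flow_limit = 300.0

def lmp_A(f):
    """Area A's boundary price: marginal cost of serving its own load plus the export."""
    return a_A + b_A * (load_A + f)

def lmp_B(f):
    """Area B's boundary price: marginal cost of serving its own load minus the import."""
    return a_B + b_B * (load_B - f)

# Coordination loop: raise the export while the importing area's price is higher,
# until the boundary prices agree or the tie-line limit binds.
f, step = 0.0, 1.0                                 # MW adjustment per $/MWh of price gap
for _ in range(500):
    f = np.clip(f + step * (lmp_B(f) - lmp_A(f)), -flow_limit, flow_limit)

print(f"tie flow {f:.1f} MW, boundary LMPs {lmp_A(f):.2f} and {lmp_B(f):.2f} $/MWh")
```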

 
