LOIS LECTURES 


TU/e research cluster Logistics, Operations, and their Information Systems (LOIS)

P. Glynn, Stanford University, USA

 

Perspectives on Traffic Modeling

 

Traffic modeling is concerned with modeling the input processes to stochastic models arising in the analysis of supply chains, production systems, communications networks, and many other problem contexts.
Traffic models play a key role in both the analytic study and simulation of such systems.
In this talk, we will discuss some recent developments in this area, and some of the associated issues and challenges that arise in the specification of appropriate traffic models.
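The abstract stays at a high level, but one concrete instance of a traffic model used as simulation input is the nonhomogeneous Poisson process. Below is a minimal Python sketch of Lewis-Shedler thinning, a standard way to generate such arrival streams; the rate function, horizon, and seed are invented for illustration and are not from the talk.

```python
import math
import random

def nhpp_arrivals(rate_fn, rate_max, horizon, rng=None):
    """Sample arrival times of a nonhomogeneous Poisson process on
    [0, horizon] by Lewis-Shedler thinning: generate a homogeneous
    Poisson stream at rate rate_max and keep each candidate point t
    with probability rate_fn(t) / rate_max."""
    rng = rng or random.Random(42)
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(rate_max)          # candidate inter-arrival
        if t > horizon:
            return arrivals
        if rng.random() < rate_fn(t) / rate_max:
            arrivals.append(t)                  # accept: matches local rate

# Illustrative time-varying demand peaking mid-horizon (rate_max bounds it).
rate = lambda t: 5.0 + 4.0 * math.sin(math.pi * t / 10.0)
print(len(nhpp_arrivals(rate, rate_max=9.0, horizon=10.0)))
```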



A. Gionis, Yahoo! Research, Barcelona

The community-search problem and how to plan a successful cocktail party

 

A lot of research in graph mining has been devoted to the discovery of communities. Most of this work has focused on the scenario where communities are to be discovered with reference only to the input graph. However, in many interesting applications, such as social-network analysis, query-log analysis, and biology, one is interested in finding the community formed by a given set of nodes. In this work we study a query-dependent variant of the community-detection problem, which we call the community-search problem: given a graph G and a set of query nodes in the graph, we seek a subgraph of G that contains the query nodes and is densely connected.

We motivate a measure of density based on minimum degree and distance constraints, and we develop an optimum greedy algorithm for this measure. We then characterize a class of monotone constraints and generalize our algorithm to compute optimum solutions satisfying any set of monotone constraints. Finally, we modify the greedy algorithm and present two heuristic algorithms that find communities of size no greater than a specified upper bound. Our experimental evaluation on real datasets demonstrates the efficiency of the proposed algorithms and the quality of the solutions we obtain.
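As a concrete illustration of the greedy idea described above for the minimum-degree measure, here is a minimal Python sketch of the peeling approach: repeatedly delete a minimum-degree non-query vertex and keep the intermediate subgraph with the largest minimum degree. It deliberately omits the connectivity, distance, and general monotone constraints handled by the full algorithm, and the toy graph is invented.

```python
from collections import defaultdict

def community_search(edges, query):
    """Greedy peeling sketch for the community-search problem: repeatedly
    remove a minimum-degree vertex that is not a query node, and return
    the intermediate subgraph with the largest minimum degree.  (The full
    algorithm also enforces connectivity and distance constraints.)"""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    nodes = set(adj)
    best, best_score = set(nodes), min(len(adj[v]) for v in nodes)
    while any(v not in query for v in nodes):
        v = min((v for v in nodes if v not in query),
                key=lambda x: len(adj[x]))
        nodes.discard(v)                      # peel the weakest non-query vertex
        for u in adj.pop(v):
            adj[u].discard(v)
        score = min(len(adj[u]) for u in nodes)
        if score > best_score:                # keep the densest snapshot seen
            best, best_score = set(nodes), score
    return best

# Toy graph: a dense triangle attached to a path; query node 'a'.
edges = [("a","b"), ("a","c"), ("b","c"), ("c","d"), ("d","e")]
print(community_search(edges, query={"a"}))   # -> {'a', 'b', 'c'}
```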

This is joint work with Mauro Sozio.

P. Zipkin, Duke University, Durham, USA

 

Supply Streams

 

A supply stream is a continuous version of a supply chain. It is like a series inventory system, but stock can be held at any point along a continuum, not just at discrete stages. We assume stationary parameters, and we focus on stationary echelon base-stock policies, which are known to be optimal for discrete stages. We show that the solutions to discrete-stage systems converge monotonically to a limit, as the distances between the stages become small, and this limit solves the continuous-stage system. We examine two such models in greater detail, one with Poisson demand and the other with Brownian-motion demand. For Poisson demand we develop tractable computational methods. For Brownian motion we derive a partial differential equation that characterizes the optimal solution. This model, it turns out, is closely related to one describing first-passage times. This linkage leads to some interesting and useful results. Specifically, we find closed-form solutions for certain special cases, the first in the inventory literature.
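The continuous-stage analysis builds on discrete-stage echelon base-stock policies. As background only, and not the talk's method, here is a minimal Python sketch of the single-stage building block: the critical-ratio (newsvendor-type) base-stock level under Poisson lead-time demand. The demand rate and cost parameters are invented for illustration.

```python
import math

def poisson_cdf(k, lam):
    """P(D <= k) for D ~ Poisson(lam), accumulated term by term."""
    term, total = math.exp(-lam), math.exp(-lam)
    for i in range(1, k + 1):
        term *= lam / i
        total += term
    return total

def base_stock_level(lam, h, b):
    """Smallest S with P(D <= S) >= b / (b + h): the critical-ratio
    solution for a single stage with Poisson lead-time demand, unit
    holding cost h, and unit backorder cost b."""
    target, s = b / (b + h), 0
    while poisson_cdf(s, lam) < target:
        s += 1
    return s

# Illustrative parameters (not from the talk): mean lead-time demand 20,
# holding cost 1, backorder cost 9 -> a 90% critical ratio.
print(base_stock_level(lam=20.0, h=1.0, b=9.0))
```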

A. Mandelbaum, Technion - Israel Institute of Technology

 

Service Engineering: Data-Based Science in support of Service Management, or
"Empirical Adventures in Call-Centers and Hospitals"

I shall describe examples of complex service operations for which data-based simple models have been found useful - I refer to this as "Simple Models at the Service of Complex Realities." The examples cover hospitals, banks, courts and more, but most attention will be given to telephone call centers. I view these service systems through the mathematical lenses of Queueing Science, with a bias towards Statistics.

More specifically, through the examples I review an emerging scientific discipline now referred to as Service Engineering. I focus on empirical findings that motivate, or are motivated by (or both), interesting research questions. These findings give rise to model features that are prerequisites for useful service models: for example, customers' (im)patience, time-varying service demand (predictable variability), heterogeneity of customers and servers (skills-based routing), over-dispersion in Poisson arrivals, generally distributed (as opposed to exponential) service and patience durations, and more. Empirical analysis also enables the validation of existing models and protocols, either supporting or refuting their relevance and robustness.

 

The mathematical framework for my models is asymptotic queueing theory, where limits are taken as the number of servers increases indefinitely, in a way that maintains a delicate balance between staffing level and offered load. Asymptotic analysis reveals an operational regime that achieves, already at moderate scale, remarkably high levels of both service quality and efficiency. This is the QED (Quality- and Efficiency-Driven) regime, discovered by Erlang and substantiated mathematically by Halfin & Whitt.
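The square-root staffing rule behind the QED regime can be illustrated numerically. The sketch below uses the classical Erlang-C delay formula, computed via the numerically stable Erlang-B recursion, to show that staffing n = R + beta*sqrt(R) servers keeps the delay probability roughly constant, strictly between 0 and 1, as the offered load R grows; beta = 1 is an arbitrary illustrative choice, not a value from the talk.

```python
import math

def erlang_c(n, r):
    """Delay probability P(wait > 0) in an M/M/n queue with offered
    load r = lambda/mu, via the Erlang-B recursion (stable for large n);
    requires n > r for stability."""
    b = 1.0                          # Erlang-B blocking with 0 servers
    for k in range(1, n + 1):
        b = r * b / (k + r * b)      # B(k) from B(k-1)
    return n * b / (n - r * (1.0 - b))

# Square-root staffing: as R grows, utilization tends to 1 while the
# delay probability settles at a nondegenerate constant -- high
# efficiency and high service quality at the same time.
beta = 1.0
for R in [10, 100, 1000]:
    n = math.ceil(R + beta * math.sqrt(R))
    print(f"R={R:5}, n={n:5}, util={R/n:.3f}, P(wait)={erlang_c(n, R):.3f}")
```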

 

My main data source is a unique data repository from call centers and hospitals, maintained at the Technion's SEE Laboratory (SEE = Service Enterprise Engineering; see http://ie.technion.ac.il/Labs/Serveng/). It is unique in that it is transaction-based: it details the individual operational history of all service transactions (e.g., calls in a call center or patients in an emergency department). For example, one publicly available source of data is a network of 4 call centers of a U.S. bank, spanning 2.5 years and covering about 1000 agents; there are 218,047,488 telephone calls overall, of which 41,646,142 were served by agents, while the rest were handled by answering machines. The data can be explored via SEEStat, an environment for online data analysis, accessible at http://seeserver.iem.technion.ac.il/see-terminal/ after registration.

D. Bertsimas, MIT, January 11, 2010

On the power of robust solutions in multi-stage stochastic and adaptive optimization

This talk explores several advances in multi-stage stochastic and adaptive optimization. We consider a multi-stage mixed-integer stochastic optimization problem and show that a static, finitely adaptable solution (which reduces to a robust solution in two stages), computable in polynomial time, is a good approximation to the fully adaptable multi-stage solution of the stochastic problem under fairly general assumptions on the uncertainty set and the probability distribution. In particular, we show that if the right-hand side of the constraints is uncertain and belongs to a symmetric uncertainty set (such as a hypercube, ellipsoid, or norm ball) and the probability measure is also symmetric, then the cost of the optimal fixed solution to the corresponding robust problem is at most twice the optimal expected cost of the multi-stage stochastic problem. Furthermore, we show that the bound is tight for symmetric uncertainty sets and can be arbitrarily large if the uncertainty set is not symmetric. We refer to the ratio of the optimal cost of the robust problem to the optimal cost of the two-stage stochastic problem as the stochasticity gap. We also extend the bound on the stochasticity gap to another class of uncertainty sets, referred to as positive.
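To fix ideas, the stochasticity gap can be written schematically as below; the notation is assumed for illustration and is not taken verbatim from the talk. The robust problem hedges one fixed decision against every right-hand side in the uncertainty set, while the stochastic problem optimizes the expected cost of a recourse decision.

```latex
% Schematic two-stage versions of the two problems (notation assumed):
\[
z_{\mathrm{Rob}} \;=\; \min_{x}\ \bigl\{\, c^{\top}x \;:\; Ax \ge b \ \ \forall\, b \in U \,\bigr\},
\qquad
z_{\mathrm{Stoch}} \;=\; \min_{x(\cdot)}\ \mathbb{E}_{b \sim \mu}\!\bigl[\, c^{\top}x(b) \,\bigr]
\ \text{ s.t. } Ax(b) \ge b \ \ \forall\, b,
\]
\[
\text{stochasticity gap} \;=\; \frac{z_{\mathrm{Rob}}}{z_{\mathrm{Stoch}}} \;\le\; 2
\qquad \text{for symmetric } U \text{ and symmetric } \mu .
\]
```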

If both the objective coefficients and the right-hand side are uncertain, we show that the stochasticity gap can be arbitrarily large even if the uncertainty set and the probability measure are both symmetric. However, we prove that the adaptability gap (the ratio of the optimal cost of the robust problem to the optimal cost of a multi-stage fully adaptable problem) is at most four, even if both the objective coefficients and the right-hand side of the constraints are uncertain and belong to a symmetric uncertainty set. The bound holds for the class of positive uncertainty sets as well. Moreover, if the uncertainty set is a hypercube (a special case of a symmetric set), the adaptability gap is one under an even more general model of uncertainty where the constraint coefficients are also uncertain.

A class of affine policies has been extensively studied in the literature on multi-stage optimization. We show that affine policies are within a factor of O(m^(1/2)) of optimal in multi-stage problems, and that the bound is tight, thus giving a tight characterization of this class of tractably computable policies. (Joint work with Vineet Goyal, MIT.)
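For concreteness, an affine policy restricts the recourse decision to an affine function of the observed uncertainty. The notation below is assumed for illustration only:

```latex
\[
y(\xi) \;=\; P\xi + q, \qquad P \in \mathbb{R}^{n \times m},\ q \in \mathbb{R}^{n},
\qquad
\frac{z_{\mathrm{Aff}}}{z_{\mathrm{Adapt}}} \;=\; O\!\left(\sqrt{m}\right),
\]
```

where z_Aff and z_Adapt denote the optimal costs over affine and fully adaptable policies, respectively, and m is the dimension of the uncertainty.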

Dimitris Bertsimas is the Boeing Professor of Operations Research and co-director of the Operations Research Center at the Massachusetts Institute of Technology. He received a B.S. in Electrical Engineering and Computer Science from the National Technical University of Athens, Greece, in 1985, an M.S. in Operations Research from MIT in 1987, and a Ph.D. in Applied Mathematics and Operations Research from MIT in 1988. He has been on the MIT faculty since 1988.

His research interests include optimization, stochastic systems, data mining, and their applications. In recent years he has worked on robust optimization, health care, and finance. He has co-authored more than 110 scientific papers as well as the books "Introduction to Linear Optimization" (with J. Tsitsiklis, Athena Scientific and Dynamic Ideas, 2008), "Data, Models and Decisions" (with R. Freund, Dynamic Ideas, 2004), and "Optimization over Integers" (with R. Weismantel, Dynamic Ideas, 2005). He is currently the department editor for Optimization at Management Science and a former area editor for Financial Engineering at Operations Research. He has supervised 42 doctoral students and is currently supervising 10 more.

He is a member of the National Academy of Engineering and has received numerous research awards, including the Farkas Prize (2008), the Erlang Prize (1996), the SIAM Prize in Optimization (1996), the Bodossaki Prize (1998), and the Presidential Young Investigator Award (1991-1996).


Prof. Carlos Daganzo, UC Berkeley, September 17, 2009

Adaptive Control of Urban Transportation Systems

This seminar will discuss two recent advances in urban transportation system management using advanced technologies: (i) self-synchronizing public transportation routes, and (ii) adaptive congestion management at the urban scale through pricing and other forms of flow control.
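The abstract gives no algorithmic detail, but the self-synchronization idea can be hinted at with a minimal, hypothetical Python sketch of headway-based holding in the spirit of this line of work: a bus that has drifted too close to its leader is held briefly at a control point so that headways drift back toward target. The gain, cap, and numbers below are invented for illustration.

```python
def holding_time(forward_headway, target_headway, gain=0.6, max_hold=120.0):
    """Headway-based holding sketch: a bus whose headway to its leader
    has shrunk below target is held at a control point, pushing the
    headway back toward target.  A gain in (0, 1) trades speed of
    correction against added passenger delay; values are illustrative."""
    deviation = target_headway - forward_headway
    return min(max_hold, max(0.0, gain * deviation))

# A bus running 90 s early against a 600 s (10-minute) target headway:
print(holding_time(forward_headway=510.0, target_headway=600.0))  # 54.0
```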


Prof. L. Green, Columbia Business School, June 24, 2009

Using Operations Research to Reduce Delays for Healthcare

"Timeliness" has been identified as a key dimension of healthcare quality. Yet patient delays remain prevalent throughout healthcare systems resulting in dissatisfaction, adverse clinical consequences and often, higher costs. This talk will describe several examples of OR models that have been developed to help healthcare administrators manage healthcare resources more cost-effectively and reduce patient delays for care. Applications will include physician staffing in emergency rooms, inpatient nurse staffing, and patient demand management in outpatient practices.

Linda V. Green, Armand G. Erpf Professor, earned her doctorate in Operations Research from Yale University. Her research, which has focused on the development and application of mathematical models of service systems, has resulted in dozens of publications in the major technical journals as well as in prominent healthcare journals such as Health Services Research, Inquiry and Academic Emergency Medicine. Her current research focuses on improving the delivery of healthcare. Specific projects include reducing delays for emergency care, providing timely access to primary care, and the development of a new nurse staffing methodology. She is co-founder and co-director of the Columbia Alliance for Healthcare Management, a unique partnership of the College of Physicians and Surgeons, the Mailman School of Public Health and the Business School dedicated to promoting interdisciplinary research and education in healthcare management. She has been a consultant for both private and public sector organizations and has served on several advisory boards. She has held many administrative positions both at Columbia and in the major professional societies, and was the Department Editor for Public Sector Applications for the journal Management Science. In recognition of her professional achievements, Professor Green was elected a Fellow of INFORMS, the premier international professional society for operations research and management science.


Prof. Sean Meyn (University of Illinois at Urbana-Champaign), January 12, 2009

Dynamics of Prices in Electric Power Networks

The failure of deregulation in California is typically framed as the result of manipulation by Enron, combined with high demand and scarce supply due to hot weather. Although these factors were likely important, many questions remain. In particular, neither drought nor market manipulation is surprising to anyone living in California. Why didn't economists predict market failure?

High prices and price volatility are not unique to California. During the brief period of deregulation of electricity in Illinois in 1998, energy prices hit spikes of $6,000 per megawatt hour, about 100 times the prices seen before deregulation. Even today, prices at the Amsterdam Power Exchange in Europe show tremendous volatility.

Market designs and analyses are typically based on very simple static models that ignore some very important aspects of the problem. In particular, prices rise precisely when capacity becomes low relative to demand, so generators profit most when capacity is scarce. This leaves little incentive for generators to increase capacity.

The goal of this research is to explain price volatility by examining a dynamic model of power distribution. A dynamic flow model is constructed for a single-consumer setting, in analogy with a standard stochastic queueing model. Surprisingly, in the absence of transmission constraints, it is shown that there is a unique equilibrium that is efficient in the economic sense. In this equilibrium, however, power prices show extreme volatility and high average prices, just as seen in California in 2001.
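As a deliberately crude caricature, and not the model analyzed in the talk, the Python sketch below shows how a thin reserve margin turns modest demand fluctuations into rare but extreme price spikes: the spot price sits near marginal cost until reserve capacity is exhausted, then jumps to the value of lost load. All numbers are invented, with the $6,000/MWh spike level echoing the Illinois episode above.

```python
import random

def simulate_spot_prices(steps=1000, capacity=105.0, mean_demand=100.0,
                         voll=6000.0, marginal_cost=60.0, seed=3):
    """Toy dynamic supply-demand model: demand follows a mean-reverting
    random walk; whenever it exhausts the fixed capacity, the spot price
    jumps from marginal cost to the value of lost load (VOLL)."""
    rng = random.Random(seed)
    demand, prices = mean_demand, []
    for _ in range(steps):
        demand += 0.2 * (mean_demand - demand) + rng.gauss(0.0, 2.0)
        prices.append(voll if demand >= capacity else marginal_cost)
    return prices

# A 5% reserve margin: prices are usually flat, yet rare excursions
# produce spikes that dominate the average -- volatility plus high mean.
prices = simulate_spot_prices()
spikes = sum(p > 60.0 for p in prices)
print(f"mean price {sum(prices) / len(prices):.0f}, spikes {spikes}/{len(prices)}")
```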

[1] http://black.csl.uiuc.edu/~meyn/pages/Market%202006/Market06.html

Sean P. Meyn, University of Illinois, USA, received the B.A. degree in Mathematics (summa cum laude) from UCLA in 1982 and the Ph.D. degree in Electrical Engineering from McGill University in 1987, supervised by Prof. P. Caines. After a two-year postdoctoral fellowship at the Australian National University in Canberra, he moved to the Midwest. He is now a Professor in the Department of Electrical and Computer Engineering and a Research Professor in the Coordinated Science Laboratory at the University of Illinois, and a Fellow of the IEEE. Dr. Meyn has served on the editorial boards of several journals in the systems and control and applied probability areas. He was a University of Illinois Vice Chancellor's Teaching Scholar in 1994. He is coauthor, with Richard Tweedie, of the monograph Markov Chains and Stochastic Stability, for which they jointly received the 1994 ORSA/TIMS Best Publication in Applied Probability Award. His new book, Control Techniques for Complex Networks, is published by Cambridge University Press. He has held visiting positions at universities all over the world. His research interests include stochastic processes, optimization, complex networks, and information theory.