Cindy Barnhart at CPAIOR

I am in Paris attending the CPAIOR (Constraint Programming/Artificial Intelligence/Operations Research) conference. I was the co-Program chair for this, which means my work is done, but now I get to see how good the papers we accepted are. On the whole, things are very good, with a surprise or two (each way!).

Cindy Barnhart of MIT was the plenary speaker this morning, talking about challenges and opportunities for OR in the airline industry, with an emphasis on plane and crew scheduling. Cindy has been working on airline applications for 20 years, so she has a wealth of experience to bring to this. She began with her view on what aspects of mathematical modeling are most useful in this application area. In her view, two key aspects are

  • Composite variables: defining variables to include complicated structures. For instance, instead of just having a variable for the number of planes on a particular leg to meet demand, the variables would represent combinations of planes, with a binary variable equal to 1 if that combination is used. While this leads to more variables, the resulting models are much easier to solve, since their linear relaxations are closer to being integer.
  • Implicit formulations: Instead of including all levels of variables, include only the highest level variable, but add constraints so that there is a feasible assignment for the other variables. For instance, instead of including all individual planes, only have a variable for the number of planes of a particular type. Add some constraints so that once the number of planes is known, individual planes can be feasibly assigned.
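
The composite-variable idea can be sketched in a few lines. This is a toy illustration with invented capacities, costs, and demand, not the actual airline models: instead of integer variables counting planes of each type, enumerate whole combinations that cover demand and attach one binary variable to each.

```python
from itertools import product

# Invented data: two plane types, their seat capacities and operating costs.
capacity = {"small": 100, "large": 180}
cost = {"small": 9, "large": 14}   # cost units per flight
demand = 250                       # seats needed on this leg

# Composite variables: each candidate is a whole combination of planes
# (n_small, n_large) covering demand; a binary variable per combination
# replaces the two integer count variables.
composites = [
    (ns, nl)
    for ns, nl in product(range(4), repeat=2)
    if ns * capacity["small"] + nl * capacity["large"] >= demand
]

def total_cost(comb):
    ns, nl = comb
    return ns * cost["small"] + nl * cost["large"]

# With one binary per composite, the model just picks the cheapest combination.
best = min(composites, key=total_cost)
print(best, total_cost(best))
```

In a real model these binaries would sit inside an integer program with many legs; the payoff is that the LP relaxation over whole combinations is much closer to integral than one over raw plane counts.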

This led to the question, then, “Why are things so bad?” Why is on-time service down, and why are people so angry at airlines (particularly in the US)? It is clear that planned schedules don’t correspond to actual operations. How can models help to close that gap? Cindy offered a wealth of models and insights:

  1. Creating robust schedules. The key idea in this model is to give more slack to planes more likely to be delayed. Surprisingly, even without changing the schedule, it is possible to reduce delays simply by the assignment of planes to legs: in essence, make sure planes coming in from San Francisco (where delays due to fog are common) are assigned outgoing legs a little later than an identically arriving plane from Phoenix. These delays can be reduced quite a bit more by slightly (up to 15 minutes) modifying the schedule.
  2. Auctioning for landing slots. Surprisingly, at almost all US airports there is no coordination among airlines in their schedules, so far too many planes arrive or leave in a short period, overwhelming the airport capacity. It seems obvious that slots should be auctioned off so that airport capacity is not violated.
  3. Dynamic Scheduling. Predicting demand on any particular leg on a particular day is approximate at best. As demand comes in, it is possible to change the capacity between cities by, for instance, changing a flight so that a formerly illegal connection is now legal. Of course, this must be one-way: all previously purchased connections must remain legal. An airline could even “refleet” a flight, changing the plane to a larger or smaller one to meet demand (in this case, the crew must still be legal for the flight, and these changes propagate through the system as the “wrong” planes continue through). Even minor changes (no more than a 15-minute move) can increase the profitability of a schedule by 2-4%, with most of the gain coming through schedule changes, not refleeting.
  4. Passenger-Centered recovery. In the case of a “disruption” (a storm or other issue), airlines face the problem of getting planes, crews, and passengers back to normal as quickly as possible. Normally airlines treat the problem in that order: get the planes in the right place, then the crews, then worry about the passengers. What if airlines combined all three and tried to minimize average customer disruption while getting planes and crews back on schedule? It is possible to greatly reduce the average passenger delay, but the airline might have to delay a plane that is ready to go. So the delay of “undisrupted” customers goes up a bit in order to greatly reduce the delay for “disrupted” customers. That is going to be a hard step for an airline to take.
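
The first idea above, cutting delay purely by reassigning planes to legs without touching the timetable, can be sketched in a few lines of Python. All the delay and slack numbers here are invented for illustration:

```python
import itertools

# Invented expected arrival delays (minutes) for two inbound planes.
expected_delay = {"SFO": 25, "PHX": 5}

# Scheduled slack (minutes) before each outgoing leg departs.
slack = {"leg_early": 10, "leg_late": 30}

def propagated(plane, leg):
    """Delay pushed onto the outgoing leg if `plane` flies it."""
    return max(0, expected_delay[plane] - slack[leg])

# Try every assignment of planes to outgoing legs; the schedule itself
# never changes, only which tail number flies which leg.
best_assign, best_total = None, float("inf")
for legs in itertools.permutations(slack):
    total = sum(propagated(p, l) for p, l in zip(expected_delay, legs))
    if total < best_total:
        best_assign, best_total = dict(zip(expected_delay, legs)), total

print(best_assign, best_total)
```

The fog-prone SFO arrival gets the leg with more slack, and the expected propagated delay drops to zero, exactly the effect Cindy described before any schedule changes are even considered.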

Nice talk, but it made me think that airlines are harder to work with than the groups I normally work with.

Pulleyblank Lecture at Georgia Tech

Bill Pulleyblank, Vice President, Center for Business Optimization, IBM (I have written about CBO before) gave a talk at Georgia Tech on April 17.   The title was “Computing, Business, and Operations Research: The Next Challenges”.  Here is the abstract:

 There have been two consistent drivers over the last sixty years of the evolution of computing: Computer power and price/performance improve by a factor of two every eighteen months; the problems that we wish to solve require this growth in capability and more. We seem to be reaching inflection points with both of these drivers. High performance systems are turning to massive parallelism to continue the required growth in performance. New challenges are arising in business and industry that require the solution of fundamentally different problems as well as the development of new approaches to old problems. Moreover, the rapid growth of a global economy has given an unprecedented urgency to dealing with these changes. I will review these subjects and some approaches that are being applied, with varying degrees of success. In particular, I will discuss five technical problems that must be solved to enable us to successfully meet the business challenges that we will face in the future.

Bill was a plenary speaker at INFORMS Pittsburgh 2006,  and I thought his talk was one of the highlights of the conference.  Georgia Tech has made a video and his slides available.

Netherlands Railway Edelman summary

I just got out of the “reprise” of the winning Edelman prize work by Netherlands Railways, and it was very, very good.

If you have been to the Netherlands, they have a very nice way of handling their trains:  every route repeats every hour.  So if you want to go from Utrecht to Amsterdam, there are trains at 13, 25, 43 and 55 minutes after the hour, every hour.  Of course, the size of the train changes depending on the expected demand.

The Dutch have had the same schedule since 1970, but it was time for a change.  Demand has doubled in that period, and there was little opportunity to increase the infrastructure.  So could a better schedule reduce costs and improve service?  Of course, or it wouldn’t be an Edelman Prize winner!

There were three pieces to the system:  the timetabler, the crew assignment system, and the rolling stock assignment system.  The timetabler was very interesting.  You can formulate the problem as a mixed-integer program, but the cyclic nature of the hourly schedule requires the use of “mod” constraints.  They tried MIP for a while (and even called in the big gun Lex Schrijver to look at it:  I saw a talk by Lex a decade or more ago on work he did with Netherlands Railway on purchasing rail cars that is still one of the nicest applications talks I have ever seen), but eventually went over to constraint programming.  From the talk, it appears they used constraint programming primarily to find a feasible solution, and then did some local improvements, but I will have to wait for the paper to be sure what is being done.
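
The talk didn’t spell out the formulation, but the “mod” constraints have the flavor of periodic event scheduling. Here is a toy sketch with invented events and time windows, solved in the constraint-programming spirit of just finding any feasible assignment:

```python
from itertools import product

PERIOD = 60  # the timetable repeats every hour

# Toy instance: three departure events, with cyclic constraints saying
# (t_j - t_i) mod 60 must lie in [lo, hi].  Windows are made up.
constraints = [
    (0, 1, 10, 20),   # event 1 departs 10-20 minutes after event 0
    (1, 2, 10, 20),   # event 2 departs 10-20 minutes after event 1
    (2, 0, 25, 35),   # and the cycle closes back to event 0
]

def feasible(times):
    return all(lo <= (times[j] - times[i]) % PERIOD <= hi
               for i, j, lo, hi in constraints)

# Constraint-programming flavour: search for *any* feasible assignment,
# fixing t_0 = 0 to break the rotational symmetry of the cycle.
solution = next((0, t1, t2)
                for t1, t2 in product(range(PERIOD), repeat=2)
                if feasible((0, t1, t2)))
print(solution)
```

The mod operator is what makes a plain MIP formulation awkward (it forces extra integer variables for the wraparound), while a feasibility search handles it directly.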

The other problems were solved with specialized integer programming methods.

The best part of the talk was their discussion of the implementation, which is where the models really came through.  The plan was to have an important length of track doubled to coincide with the new schedule.  That doubling didn’t happen, so they had to reschedule the crew essentially at the last minute.

The results have been really impressive.  Crew costs are down 4% and rolling stock usage is down 6%, combining for an annual benefit of $75 million.  As is not uncommon in OR projects, the system is both cheaper and works better.  Punctuality is at an all-time high, and customer satisfaction is similarly improved.  The increase in punctuality allowed them to increase fares by an additional 2%, for further profits of $30 million.  Other than the fare increase, this looks like win-win:  better service, more convenient times, cheaper operating costs.

At the end there was a nice video clip of the Minister of Transport saying how important OR is.

A worthy Edelman winner!

Green Supply Chains

Environmental modeling has been an increasingly active area of OR over the past few years (see the greenOR blog for many examples). As companies strive to do good, either for economic or other reasons, they are thinking more about environmental impact in all that they do.

ILOG has just announced a “Carbon Footprint” extension to its supply chain modeling software (which in turn is based on the software from David Simchi-Levi’s previous company LogicTools). InfoWorld is one site that has picked up on this:

In an effort to aid companies as they struggle to balance profitability and environmental responsibility, vendors are rolling out increasingly sophisticated tools. Among those vendors is ILOG, which this week released a Carbon Footprint extension to its LogicNet Plus XE supply-chain application. This remarkable tool serves a valuable function: It’s designed to help companies evaluate the impact that various supply-chain network configurations and transportation strategies would have on their carbon footprint.

David Simchi-Levi is quoted regarding a case of a company trying to decide how many distribution centers to have:

Using LogicNet Plus XE with the Carbon Extension, the company cranked out various scenarios that involved adding between two and seven new distribution centers. Turns out that moving to four distribution centers would have resulted in the highest cost savings. Thus, a company that wasn’t thinking about green metrics might have gone with that option.

The company found, however, that by going up to six distribution centers, it would have slightly higher costs (1.6 percent), but it would reduce transportation distances by 20 percent and overall carbon emissions by 11 percent.

Those results might come as a surprise: How could adding six more energy-consuming distribution centers result in less carbon waste? The answer: With the six-center model, the company relies more on trains for transporting goods inbound than it does on trucks to ship products outbound. Trucks have a significantly higher environmental impact than trains, according to Simchi-Levi.

I don’t know whether a 1.6% cost increase is worth an 11% reduction in carbon emissions, but having the information makes it possible to make an informed decision.
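
One way to make that decision is to put a price on carbon and fold it into the objective. The percentages below come from the quoted example; the baseline cost and emissions figures (100 each for the four-center network) are invented for illustration:

```python
# Percentages taken from the quoted example; the baseline cost and
# emissions figures (100 each for the four-center network) are invented.
scenarios = {
    "4 centers": {"cost": 100.0, "co2": 100.0},   # cheapest network
    "6 centers": {"cost": 101.6, "co2": 89.0},    # +1.6% cost, -11% CO2
}

def best_scenario(carbon_price):
    """Pick the scenario minimizing cost + carbon_price * emissions."""
    return min(scenarios,
               key=lambda s: scenarios[s]["cost"]
                             + carbon_price * scenarios[s]["co2"])

print(best_scenario(0.0))   # with free carbon, the cheap network wins
print(best_scenario(0.5))   # price carbon and six centers take over
```

With these made-up baselines, the six-center network wins as soon as the carbon price exceeds 1.6/11 ≈ 0.15 cost units per emission unit; the point is only that the tool turns a vague trade-off into a number you can argue about.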

This reminds me of work being done in robust optimization, of various forms.  David Ryan of the University of Auckland spoke two years ago about his efforts to devise airline schedules that are less susceptible to delays (I wrote on this in 2006).  He found that a very small cost increase allowed a huge improvement in the robustness of the schedules.  Many of these supply chain optimization problems are relatively “flat” with many near-optimal solutions (cases where there is only one optimal solution with everything much worse tend to have obvious solutions).  In such a situation, “secondary” issues such as environmental impact, customer perception, or robustness to variation in data take on a higher importance.

Gene Woolsey and Consulting in Operations Research

Gene Woolsey is one of my favorite people in OR, though I have met him just a couple of times. Gene has very strong views on how OR should be taught, and he implements them at the Colorado School of Mines. A key aspect of his approach is that OR is about doing and solving problems. And no problem can be solved without spending significant amounts of time with the people currently doing the job. So if you think you have a stocking problem at a grocery store, most of us OR people would ask for data, create models, solve them, and send back the results, without ever setting foot in a store. Gene (or, more likely now, his students) would spend days working with the stock people in the grocery store, and gain a hands-on understanding of the real situation. More often than not, this allows Gene to understand what the real problem is, and to avoid wasting time on unneeded analysis.  I will confess I like this approach much more in the abstract than in practice. I worked for a while on US Postal System reorganization without spending much time at all in a postal sorting facility. That is not the way Gene would do it!

Gene writes a regular column in Interfaces, my favorite OR journal (make sure your organization or university subscribes if you are at all serious about OR) entitled “The Fifth Column”. In general, the column is about doing OR, particularly doing OR as a consultant. In the November, 2007 issue, he wrote on “How to Consult and Not Be Paid”. It is a great column that hits a bit close to home (most of my consulting seems to be of the unpaid nature). In short, Gene gets a call from a possible client. He comes up immediately with an insightful, clever, and very easy to implement solution. And then he blurts it out. The prospective client thanks him profusely, hangs up, and that is all there is. Learning not to blurt out solutions is a good lesson!

OR at P&G

Let me be late in the OR blogging game and note that there is a great article on operations research at Procter and Gamble on bnet. It is wonderful advertising for our field, including phrases like “P&G’s Killer Apps in OR”. INFORMS was strongly involved in the article, with quotes from Past-President Brenda Dietrich and Executive Director Mark Doherty:

P&G, GE, Merrill Lynch, UPS — the list of Fortune 500 companies getting into the OR game is expanding, says Mark Doherty, executive director of the Hanover, MD-based Institute for Operations Research and the Management Sciences (INFORMS), an OR think tank. “In the private sector, OR is the secret weapon that helps companies tackle complex problems in manufacturing, supply chain management, health care, and transportation,” he says. “In government, OR helps the military create and evaluate strategies. It also helps the Department of Homeland Security develop models of terrorist threats. That’s why OR is increasingly referred to as the ‘science of better.’”

Having sat in on a few too many board meetings, I think calling INFORMS an “OR think tank” is going a little far. But the article does project the very best vision of our field.

Check out another take on the article at Punk Rock Operations Research (and I thought I saw it on another OR blog, but it escapes me at the moment).

Edelman Finalists Announced

The finalists for the 2008 Franz Edelman Award for Achievement in Operations Research and the Management Sciences have been announced. The Edelman Awards are a big thing in OR. The prize is given to the best use of operations research in practice. Even getting to be a finalist is a lot of work: this is not just a matter of submitting a paper and seeing how it goes. The finalists have to work even harder. They need to prepare a highly professional presentation, with the best presentations getting the support of a firm’s very top management (one year, the South African defense department was a finalist, and President Mandela provided a letter of support). Each finalist is assigned a coach to help them prepare their presentation (the late Rick Rosenthal was proud of the role he played as a coach, and I think finalists working with him may have had a bit of an advantage). And the projects have to be real, not hypothetical. Without verifiable and significant effect, a project cannot be a finalist, let alone a winner.

INFORMS has jazzed up the competition quite a bit in recent years, with fancy presentations at the Practice Meeting. I think this is great: these projects often save hundreds of millions of dollars, or improve many lives. They deserve a celebration.

This year’s finalists are an interesting bunch:

1. Federal Aviation Administration, for a project entitled “Airspace Flow Programs,” which gives the FAA greater ability to control the nation’s skies at times of peak consumer usage and flight congestion.

2. Netherlands Railways, for “The New Dutch Timetable: The O.R. Revolution,” a solution that improved on-time performance and capacity for more than a million daily train passengers.

3. StatoilHydro, one of the world’s largest gas producers, and Gassco, the independent Norwegian network operator, for “Optimizing the Offshore Pipeline System for Natural Gas in the North Sea.”

4. The City of Stockholm, Sweden for “Operations Research (O.R.) Improves Quality and Efficiency in Social Care and Home Help,” a program that has brought improvements to the complex scheduling of more than 4,000 providers who help the sick and the elderly.

5. U.S. Environmental Protection Agency, for “Reducing Security Risks in American Drinking Water Systems.”

6. Xerox, for “LDP Lean Document Production® – Dramatic Productivity Improvements for the Printing Industry,” which has bettered production and reduced costs for print shops and document manufacturers. The total impact to date on Xerox profits from the utilization of the LDP is about $200M. Xerox has filed 48 patents on this methodology and so far 11 have issued.

That’s three international organizations and only one traditional manufacturing finalist. The Swedish finalist in particular represents a strong trend in OR: using OR in the service sector.

I am looking forward to seeing the presentations at the Practice Meeting.

Andy Boyd, Pricing, and “The Engines of Our Ingenuity”

Andy Boyd, formerly chief scientist of PROS (he is actually still on their scientific board, but is not an active employee) visited CMU today as part of our CART (Center for Analytical Research in Technology) seminar series. He talked about the challenges those in pricing research face. The main point he made is that it is very difficult to figure out demand curves (hence elasticity of demand) through data. Having even lots and lots of transaction level data doesn’t help much in generating demand curves. This is not a new insight (economists refer to these sorts of issues as “identification problems”) but it was interesting to hear this from someone who has made a living doing pricing for the last decade. Without demand curves, how can pricing be done? Airlines have enough separate flights (for which you can assume no substitution) to do a fair amount of experimentation. How can other areas get similar data? Further, Andy makes the point that without understanding the sales process, it is impossible to interpret any data. For instance, for a given kind of car, there will be a few sales at a low value, lots of sales at a medium value, and a few sales at a high value. This does not mean that the demand for cars goes up then down as a function of price! Since car prices are generally negotiated, only a few of the best negotiators will get the lowest price.
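
Andy’s car example is easy to see in a toy simulation (all numbers below are invented): negotiated discounts depend on skill, so observed transaction prices pile up in the middle even though nothing about the underlying demand is hump-shaped.

```python
import random

random.seed(0)

# Toy simulation of the negotiated-car-price example.  The list price is
# fixed, and each buyer negotiates a discount that depends on skill;
# skill is clustered around the middle (few experts, few pushovers).
LIST_PRICE = 30_000
prices = []
for _ in range(10_000):
    skill = random.betavariate(2, 2)   # in (0, 1), peaked at 0.5
    prices.append(LIST_PRICE - 6_000 * skill)

# Bucket transactions into low / medium / high price bands.
low = sum(p < 25_500 for p in prices)
mid = sum(25_500 <= p < 28_500 for p in prices)
high = sum(p >= 28_500 for p in prices)
print(low, mid, high)
```

The middle band dominates. The hump in observed prices reflects the distribution of negotiating skill, not a demand curve that rises and then falls, which is exactly the identification trap Andy warns about.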

Andy makes a strong case that operations research needs to be applied more in the entire sales domain, from customer segmentation through pricing to negotiation. The lack of underpinning in something as fundamental as a demand curve is a little disconcerting, but he stressed that for many markets (those without “posted prices”), demand curves may be the wrong modeling approach in the first place.

Andy is now “semi-retired” (I guess he did well when PROS went public) but certainly seems to have lots going on. Once a week, he does a radio show on the Houston public radio station. The show is called Engines of Our Ingenuity and Andy does his version on Thursdays. The transcripts are available for the shows. Andy is normally referred to as “guest scientist” but he is sometimes called “operations researcher”, which makes me happy. A recent show of his was on operations research legend George Dantzig, concentrating on his development of the simplex algorithm and his lack of Nobel Prize. Other episodes involve the four color theorem, mathematical models, parallel computing, and operations research itself, along with much, much more. John Lienhard is the driving force behind The Engines of Our Ingenuity.

Also, Andy has a new book out on pricing entitled The Future of Pricing: How Airline Ticket Pricing has Inspired a Revolution. Andy and I go back more than twenty years: it was great to see him and see all the amazing things he is doing, even if he is “semi-retired”.

Operations Research to Decide the Election?

Brian Borchers wrote me to comment on an article in the New York Times on how the US primaries are moving into a phase characterized by complicated resource allocation problems.  Up until now, it was easy:  candidates could spend their time in Iowa and New Hampshire (then Nevada and South Carolina) and not feel overstretched.   But with those primaries and caucuses now past, it gets more complicated.  Tuesday February 5 is “Super Tuesday” when 24 states are to choose their delegates to the national convention.  No candidate can reasonably campaign in all parts of all of these states.  But the rules on how delegates are chosen make it necessary:

…the delegate rules for Democrats and for Republicans are different and, within each party, often vary from state to state. For example, the Republicans have some states where the statewide winner gets all the delegates, providing an obvious target for a candidate who might seem strong there. Among them are Missouri, New Jersey, New York and Utah.

But there are other states where the delegates are allocated by Congressional district, sometimes winner-take-all, and sometimes proportionally.

By contrast, Democrats eliminated the so-called winner-take-all rules. Instead, delegates are allocated depending on the percentage of vote each candidate gets in a Congressional district, under very expansive rules that, generally speaking, mean the candidates divide the trove evenly assuming they get more than 30 percent of the vote. There are also some delegates allocated statewide, again proportionately.

As Brian summarizes:

The various political campaigns have been building statistical models to predict voting outcomes in different congressional districts and using these as the basis for game theoretic decisions about how best to spend their limited funds and limited candidate time. The more traditional polling approach isn’t adequate, because it would be too expensive to do separate polls for every congressional district…

This is an interesting game involving things like parity (if there are 2 delegates for a district, then it is enough to get 34% of the vote to get 1;  with 3 delegates, 51% can earn you 2), resource allocation, timing (Rudy Giuliani chose to skip the initial rounds to concentrate on Florida and the Super Tuesday states, perhaps losing too much “momentum” in the process) and so on.  Since it has been a while since both the Democratic and Republican races have been this open, this should spur interest in this sort of modeling.  Punk Rock Operations Research has a posting on forecasting and polling, but forecasts should only be the input for making better decisions. I would love to hear of any real operations research being used by the campaigns.
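
The parity arithmetic can be checked with a simplified largest-remainder allocation. This is a stand-in for the actual party rules, which vary by state and are more intricate than this sketch:

```python
def allocate(votes, seats, threshold=0.30):
    """Largest-remainder proportional allocation among candidates above
    the threshold (a simplified stand-in for the real delegate rules)."""
    total = sum(votes.values())
    eligible = {c: v for c, v in votes.items() if v / total > threshold}
    etotal = sum(eligible.values())
    # Each eligible candidate's fractional share of the seats.
    quotas = {c: seats * v / etotal for c, v in eligible.items()}
    alloc = {c: int(q) for c, q in quotas.items()}
    # Hand out remaining seats to the largest fractional remainders.
    leftover = seats - sum(alloc.values())
    for c in sorted(quotas, key=lambda c: quotas[c] - int(quotas[c]),
                    reverse=True)[:leftover]:
        alloc[c] += 1
    return alloc

# 34% is enough for 1 of 2 delegates; 51% earns 2 of 3.
print(allocate({"A": 34, "B": 66}, 2))
print(allocate({"A": 51, "B": 49}, 3))
```

In a two-delegate district the trailing candidate’s 34% rounds up to a full delegate, so the district splits 1-1 no matter how hard anyone campaigns there; that is why the district-by-district arithmetic, not statewide polling, drives where candidates spend their time.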

Dash Optimization is acquired by Fair Isaac

Dash Optimization, makers of Xpress-MP (one of the two leading linear and integer programming solvers, along with ILOG’s CPLEX), has been acquired by Fair Isaac. Fair Isaac is an analytical application company, known best for their credit rating systems (they do the FICO scores that companies use to determine whether to extend credit). This is an interesting move. On one hand, we have a premier OR company being acquired by a company that is not particularly known in the OR world (despite their analytical focus). Will Xpress-MP be forgotten, particularly as the credit market is troubled? Will Xpress-MP continue to be available and supported for the broader world?

On the other hand, a very well-known company now has operations research software as a key aspect of their offerings. Will they be able to leverage this to extend the reach of OR to more firms? One nice line of their press release:

We will showcase the Dash products and introduce our clients to these capabilities at the forthcoming InterACT conferences in Vienna and San Francisco.

Having OR methods showcased by Fair Isaac could be a huge boon to the field. If companies think “Fair Isaac is investing in OR, maybe we should find out more about the field”, that would be great! But one potentially worrisome aspect of the press release:

With Dash part of Fair Isaac, we can do more to:

  • Incorporate an additional layer of analytic power into our industry-standard solutions
  • Create advanced custom solutions based on specific client needs
  • Develop innovative new solutions for large and complex business problems

How much will Fair Isaac keep within itself rather than provide core optimization products to the wider world?  The longer press release calms fears about the role Dash software will play:

Fair Isaac’s acquisition of Dash builds upon a longstanding partnership between the two firms.  Dash optimization technology is currently embedded in Fair Isaac’s Decision Optimizer, a software tool for achieving the smartest decision strategies given operational complexities, resource constraints and market uncertainties.

“Demand for sophisticated decision management tools is growing rapidly, and the addition of Dash optimization technology to our portfolio helps us to remain the market leader,” said Mark Greene, CEO of Fair Isaac.  “With their optimization capabilities and our own business rules management and predictive analytic solutions, Fair Isaac now has the industry’s most comprehensive decision management suite.”

Being part of a “decision management suite” seems good!

There is a discussion of this on the USENET group sci.op-research. In the discussion, Bob Daniel, one of the founders of Dash Optimization, talks about the change:

The good news for Xpress-MP and its users is that ALL the Dash employees are moving enthusiastically to Fair Isaac and the development budget is considerably enhanced.

For those of you who don’t know, the name “Dash” comes from (Bob) Daniel and (Robert) ASHford. Robert and I, and the team we have recruited, have been developing Xpress-MP for over two decades and we weren’t about to sell our “baby” to people who weren’t dedicated to Math Programming and Xpress-MP. Of course, we the founders have moved to Fair Isaac too.

Best wishes to Bob Daniel and the rest of the Dash Optimization team in making this transition from a stand-alone company to part of a much bigger operation. And let’s hope that Xpress-MP continues to get better and better (and not just for the use of Fair Isaac!).